Does the Bible tell us all we need to know about God? I had some people come to my door the other day talking about their religion, and they said we need another book in addition to the Bible. — M.R.
I don't doubt their sincerity, but as a Christian I cannot agree with their viewpoint. For almost two thousand years Christians have agreed that the Bible alone is God's Word, and that it tells us everything we need to know about Him.
Why do Christians believe this? First, because of the Bible's testimony about itself. For example, hundreds of times in the Old Testament the prophets declared that they were speaking God's Word, not the words of men. As the Apostle Peter wrote, "Prophecy never had its origin in the will of man, but men spoke from God as they were carried along by the Holy Spirit" (2 Peter 1:21).
The second reason is in some ways more important: The Bible tells us that Christ's work for our salvation is complete. We do not need any further revelation about God's plan of salvation, for by His death and resurrection Jesus did everything necessary to bring salvation to us. The Bible says God "has given us everything we need for life and godliness" (2 Peter 1:3).
Don't be misled by those who don't understand who Jesus is or what He has done for us. Instead, by faith turn to Christ and trust Him alone for your salvation. The Bible promises, "If you confess with your mouth, 'Jesus is Lord,' and believe in your heart that God raised him from the dead, you will be saved" (Romans 10:9).