Some people came to our door the other day to talk about their religion, and we found it very interesting. They say the Christian church has hidden the full truth about God, and now God has chosen them to reveal it to the world. Could they be right? — Mrs. E.N.
I don't doubt their sincerity or their zeal for what they believe to be the truth; I wish more of us who are Christians had their level of commitment. But no, as a Christian, I cannot agree with the ideas you have outlined. The reason is that I base my faith on the Bible, which God gave us to tell us the truth about Himself. It tells us all we need to know about God—and most of all, it points us to Jesus Christ, who was God in human flesh. This is why the Bible says, "For no one can lay any foundation other than the one already laid, which is Jesus Christ" (1 Corinthians 3:11).
Jesus was God's final and complete revelation of Himself, and for almost 2,000 years the Church has looked to the Bible to tell us about Him. That's what I pray you will do, and that you won't be misled by anyone who twists the Bible's clear meaning. The Bible "never had its origin in the will of man, but men spoke from God as they were carried along by the Holy Spirit" (2 Peter 1:21).
Most of all, I pray that you will turn to God and confess your sins to Him, and then open your heart and life to Jesus Christ. He alone can save you from your sins, and He alone can reconcile you to the Father. By a simple prayer of faith, invite Him into your life today.