I'm studying in your country, and I come from a country that doesn't have many Christians. I want to understand your beliefs. What do Christians believe? Can you suggest a book I can read to help me understand what Christians believe? — Z.W.
The Christian faith is centered on Jesus Christ, who lived in the Middle East about 2,000 years ago. According to the Bible (which is the source of our information about Jesus), He became a traveling preacher and teacher when He was about 30 years old, and was put to death by His enemies about three years later.
But the Bible tells us that Jesus was more than a great religious teacher. The Bible says that He was God in human flesh -- not just a godly person, but actually God as well as a man. His virgin birth, His miracles, His wisdom, and especially His resurrection from the dead and ascension into Heaven -- all attest to His unique nature. The Bible puts it this way: "In Christ all the fullness of the Deity lives in bodily form" (Colossians 2:9).
Why did God come down to earth in this way? He came so we could know what He is like -- especially in His love for us. But He came mainly to tear down the barrier between us and God -- a barrier caused by our sin. He did this by becoming the complete and final sacrifice for our sins through His death on the cross.
I invite you to discover Jesus for yourself by getting a Bible and reading about Him in the Gospels (the first four books in the New Testament). When you do, you'll discover the greatest truth of all -- that God loves you, and He wants you to come to know Him and become part of His family forever.