I was raised in church all my life. My parents were strict Wesleyans. I embraced Christianity for many years, but never fully had faith. I never understood the rationale. I was taught that Christianity was the only way, and that every other religion was wrong. Actually, I was pretty much taught that I'd go to hell if I even questioned what I was being taught in church (guess I'd better just quit while I'm ahead?). But then again, I always had to figure things out for myself, the hard way or not.
So I have a few questions for any of you out there professing a relationship with Jesus Christ. I am open-minded to hear your arguments.
So why exactly is Christianity the right religion? Why is every other religion wrong? Who's to say that religion isn't just one big, fabricated lie? And most of all, why should I even care?
2007-11-06 12:30:11 · 15 answers · asked by boink 2 in Religion & Spirituality