or should I say Christians, balance the beliefs of human rights and the Christian way of life? There are just some things that I think people have the right to do, like gay relations, even though they don't agree with God's word. The Bible was written by humans after all, and one of the disciples was sexist, so how can we completely trust that what they say happened actually happened? And what about all the stuff that was supposed to be in the Bible but the Romans decided not to include? These Biblical myths that most Christians refuse to believe make more sense to me in the end; they fill in a lot of the blanks. Just so everyone knows, I'm not asking this to be ignorant, I really do want to know. I'm trying so hard to understand, but the gaps keep screaming at me, and I need more (and better) answers. I truly want the truth. Gee, maybe I should take Bible study classes at my college.
2007-12-22 13:37:47 · 22 answers · asked by Jen. E 2