It seems to me that the stories in the Bible should not be taken literally, let alone adopted as an entire lifestyle. Isn't it more important to believe in the lessons those stories teach than in the actual events (many of which have been proven false)?
Of course, stupid parts of the Bible, like calling homosexuality an "abomination," should definitely not be taken literally (or applied to our modern government).
Even the Resurrection of Jesus Christ, the most important element of the Christian religion, should not be acknowledged as an actual event. Christians focus deeply on how Christ died when they should be paying more attention to how he lived. Although it was crazy of him to proclaim himself the "son of God," he was a good person.
Although much of the Bible is quite ridiculous, it was written with good intentions, to teach lessons. Jesus preached tolerance and acceptance, yet right-wing Christians have no problem with gay-bashing.
2007-04-10 10:55:31 · 17 answers · asked by Anonymous in Religion & Spirituality