I don't know how many comments I've read that say something like: Belief in a god only leads to sexism, homophobia, irrational thinking, repression of female power, wars, inquisitions, more child abuse, rape due to repression, and so on.
In the Christian belief, these things are seen as extremely wrong and aren't part of believing in God at all. In fact, it teaches quite the opposite, precisely to avoid all of those. Without something teaching against them, we would have no order; they would run rampant in society, and no one would think they were wrong.
I acknowledge that these are simply my opinions and views, but it just doesn't make sense to me.
2007-02-11 20:34:30 · 15 answers · asked by Stiny in Religion & Spirituality