Upon doing some reading (following an accidental viewing of a few moments of one of the "Left Behind" movies), I've learned that many Christians fear the idea of a religion that unites the entire world. In fact, they associate this concept with the Antichrist and the end times. Why?
If there were a "one true faith", wouldn't it most likely be unifying, something that all of mankind could come together under, instead of something that causes war and conflict?
Why is this idea so scary?
By the way -- I don't think there should be an established faith. I'm a big fan of freedom of religion. I just want to discuss why the idea of the whole world coming together, unified through a belief, should be so frightening. So don't get up in arms about freedom of religion; you're preaching to the choir!
2007-08-20 07:02:19 · 14 answers · asked by Anonymous in Religion & Spirituality