I just finished a book by Sam Harris titled "The End of Faith". Great book, and I encourage anyone with an open mind to check it out. The thesis he puts forward is this:
If we don't do away with organized religion in modern society, it will inevitably be the end of us. Organized religion appears to be at the forefront of many of the major conflicts in the world right now, and ancient esoteric views don't seem to lend themselves to paradigm shifts very readily. In short, we're never all going to "just get along" unless we abandon these outdated worldviews.
I guess another way of putting it would be: if there were no Christians or Muslims, would we be fighting a "War on Terror"? What we appear to be fighting is a war on Islam, and they, in turn, a war on Christianity. The Islamic texts refer to any non-Muslim as an "enemy of God", and a basic tenet of Islam is that the word of God (Allah) is supreme and to be taken as law. Thoughts?
2006-07-13 13:01:23 · 9 answers · asked by jaxmiry in Religion & Spirituality