When I was in high school, the world history textbooks had a lot more to say about Christianity, Judaism, and Islam. The only thing they wrote about Hinduism was the caste system, which gave everyone a negative view of it. Even the teachers knew more about Islam and Christianity. Why is that?
They also twist some things around to make certain groups look better and to justify their actions. For example, when covering how Muslims spread their religion, they don't mention how Buddhist and Hindu temples were destroyed.
For the record, I'm an agnostic.
2006-12-17 03:11:49 · 7 answers · asked by Anonymous in Religion & Spirituality