When I was in high school, the world history textbooks had a lot more to say about Christianity, Judaism, and Islam. The only thing they wrote about Hinduism was the caste system, which gives everyone a negative view. Even the teachers knew more about Islam and Christianity. Why is that?
They twist some things around to make certain groups look better and justify their actions. For example, when they cover how Muslims spread the religion, they don't mention how Buddhist and Hindu temples were destroyed.
I'm an agnostic.
2006-12-17 03:11:49 · 7 answers · asked by Anonymous in Society & Culture ➔ Religion & Spirituality
They still teach the Aryan Invasion Theory, even though it has been debunked.
They spend an entire chapter on the Holocaust, which is justifiable, but nothing on the massacres in China. It is world history, not just European history.
And they don't have much to say about Africa or South America.
2006-12-17 03:17:56 · update #1
I am talking about WORLD history. And America was not founded as a Christian country; it was founded by people escaping religious persecution. You have no right to say America is a Christian nation.
All I'm saying is that I've noticed how everything is about Europe. On a map, Europe is in the center with Africa below and Asia to the east. It could be drawn any way, but most of the time it's like that. If a country like America is going to instill biases, then there is no hope for countries where freedom and equality are so limited.
2006-12-17 03:26:56 · update #2
Yeah, I'm American
2006-12-17 03:29:09 · update #3