This may seem like a very alarming question, but let me explain. I am beginning to think that teaching and focusing on American history throughout a child's education can be harmful to the ideal of Americanism: freedom. If a child becomes very nationalistic and regionalistic, and sees everything from the viewpoint of America the victor, then when they encounter children and immigrants from different cultures, will they see themselves as Americans and therefore superior? I guess I cannot explain this as well as I perceive the phenomenon, but do you think we should focus more on teaching children world history, or is American history the best way to go? Would more world history benefit our politicians and leaders?
2007-11-13 17:41:13 · 20 answers · asked by Jack