I'm not just trying to be nasty here; I'm hoping to make a serious point.
It seems to me that most (though obviously not all) people from the USA are incredibly ignorant when it comes to other countries. I don't understand it. Don't you have geography lessons at school? If you do, do they teach you anything about the world outside America? What annoys me the most is the way Americans actually believe all the stereotypes about British people or French people or whoever, but it really isn't the same the other way round. I think that, on average, a British person (for example) could tell you more about the United States than an American could tell you about the United Kingdom. Why is this? What is wrong with the American education system?
Also, why do so many people on this website seem to think it is only used by Americans? There are plenty of people from other countries using it too, so how about considering that when you ask questions?
2006-06-26 07:56:20 · 19 answers · asked by Anonymous in Other - Society & Culture