I am an American who has done a lot of international traveling. I have noticed one thing in every single country I have visited and lived in: white people from all over have a very arrogant, ethnocentric attitude.
In the USA it's obvious to any American. All over Africa, Europeans demand first-class status and have forcibly pushed the native people into a second class. In Australia, the Aboriginal people have been displaced and are in a constant struggle for basic human rights on their own land.

I currently live in the UAE, and I always hear white Europeans say things like "the natives need to change their culture" and "we (the Europeans) need to be given citizenship" and all the benefits that come with it, yet they have no respect for the people of this country or their way of life, and they are not shy about saying so. I find it quite perplexing. Is this something that is taught from generation to generation, or is it just how you think? Please do not respond out of emotion; I really want to know.
2007-11-14 02:25:09 · 13 answers · asked by avgwhiteguy in Other - Cultures & Groups