
Is the West merely an embodiment of white racism in our time? Recently we have seen terms like "the West" tossed around to describe America and its allies. This West is an ideology: a world set apart from all others on the basis of its supposed superiority. This is nothing new; whites have coined terms for hundreds of years to separate themselves from the rest of us in the world. On one hand, the West is all that is pure, just, and intelligent; everything else is fanatical, radical, and backwards. Ironically, this is exactly how whites saw themselves during slavery: purer, more righteous, and more intelligent than those beneath them. Thus they could subjugate Blacks and Indians, because they were always in the "right" and could do no wrong. Racism based on ethnicity has merely transformed into racism based on culture and civilization.

The "West" does not even exist; we are one world. For instance, Israel is considered part of the Western world, yet look at a map. Why is no non-white country part of this West? The West is merely a state of mind of white superiority and white power at the expense of others. I have even seen people on television describing the conflict in the Middle East as a war between the West and "them": a "them" so lowly that they do not even have a name, while whites are so holy that they can devote an actual term to describing themselves. Hence "the West"; no one has heard of the East, the South, or the North, because apparently they simply are not worth enough to be called anything at all! At other times they describe the conflicts as civilization versus the uncivilized. Pure racism: whites have always felt they were more civilized than the rest of us, and this is nothing new. Yet they fail to realize that people in that part of the world are just like everyone else; simply because they don't fit the white view of things, they are deemed inferior! Even posters on this site have described people outside of the "West" as if they were animals!
One man on here said that the Arabs have nothing of their own and steal everything from Europe. This sounds like white supremacy: the idea that nobody outside of Europe has any brains whatsoever and everything revolves around Europe. He went as far as to say that the Arabs had nothing. If he feels this way about Arabs, what does he think of Africa and the rest of Asia? Pure ignorance, and it is not just one man; it is all of you. And the sad thing is that you don't even realize it.

2006-08-09 05:58:22 · 6 answers · asked by Anonymous in Politics & Government Politics

6 answers

...and learn to write and spell.

2006-08-09 06:20:55 · answer #1 · answered by rustyshackleford001 5 · 1 0

That was way too much to read, but no I don't think the west is racist.

2006-08-09 13:02:41 · answer #2 · answered by Phil My Crack In 4 · 2 0

Get a grip (on reality) before you go completely bananas!!!

2006-08-09 13:03:09 · answer #3 · answered by Walter Ridgeley 5 · 1 0

The West is not racist, but the East is!!!!

2006-08-09 13:06:01 · answer #4 · answered by Vagabond5879 7 · 2 0

Blah, Blah, Blah, go away.

2006-08-09 14:39:04 · answer #5 · answered by Anonymous · 1 1

And your point is...?

2006-08-09 13:04:38 · answer #6 · answered by duc602 7 · 1 0
