
5 answers

No. Not intentionally. Parents teach racism.

2006-08-23 16:35:16 · answer #1 · answered by da_hammerhead 6 · 0 0

If they do, I don't think it's intentional.

The government promotes racism by giving black people and others benefits that white people aren't entitled to. Go and ask most teenagers if they have a problem with other races, and if they say yes, I can guarantee they will say "IT'S NOT FAIR THEY GET BENEFITS." So I believe the government promotes racism by segregating races like this.
Also, the race itself promotes racism through its stand-up comedians. Every black comedian talks about white people in their act. It's boring and it's not funny. I'd like to see a white comedian say something about a black person; I can guarantee he would get booed off stage.
So the government and the race itself promote racism!

2006-08-23 23:40:15 · answer #2 · answered by Jade H 3 · 0 0

I don't know what goes on in all schools, but in my experience school staff and teachers usually go out of their way to be politically correct. They have to take seminars about societal conditions and problems and stay up-to-date on issues like racism, immigration/language, gangs, drugs, teenage pregnancy, AIDS, etc.

2006-08-23 23:40:45 · answer #3 · answered by wlmssb 3 · 0 0

Some do, but not all of them.

2006-08-23 23:34:52 · answer #4 · answered by Andrea 5 · 0 0

Not that I know of.

2006-08-23 23:35:37 · answer #5 · answered by weswe 5 · 0 0
