
for me, ever since i was 4, i've learnt that africa is a very diverse continent made up of black, white and asian people, disabled people, young and elderly people, heterosexuals, homosexuals; all types of people. yet why is it that when africa is mentioned, particularly in the media and the news, we are constantly fed images of just blacks and whites, and black people are regularly and negatively depicted as orphans, poor and helpless, when that is not always the case? because for me, i see africa as being just as ethnically and socially diverse as europe and the americas, to name a couple. where it does differ is in its economic and financial structure, which certain parts seem to lack.

what are your general views on the question i've posed here? are you fed up with the media and certain people who insist africa is a country, which it is not? what about the negative stereotypes that we are shown on tv and in the newspapers? what do you make of those?

2006-10-10 09:14:41 · 4 answers · asked by Anonymous in Society & Culture > Cultures & Groups > Other - Cultures & Groups

as ever, i don't want any derogatory or racist comments, and please refrain from using offensive language. thanks!

2006-10-10 09:17:24 · update #1

4 answers

As an African living in the US, I would tell you that it is a perception Americans still have. They bought people as slaves from Africa, which is why they still consider Africa as not amounting to anything more than poverty, violence and disease.
I have lived in Europe, and I can tell you it is different there. Blacks are respected and recognised as a people with ideas who work hard.
But in the States, where the media is still controlled by the slave masters, they have nothing to report about Africa other than the obvious. To me, this is no better than waiting for trouble to break out and then delivering rice with your name on the bag just to prove that you care. Why not show you care by reporting accurately to the world the problems the continent is facing?
I think the practice is absurd, degrading and dehumanising.
Can you imagine that in order to get news from Africa, I have to tune in to BBC Africa?
But if you visit African cities today, everybody wants the latest news from America. So I wonder, why be eager to know about those who do not want you?

2006-10-10 09:52:22 · answer #1 · answered by Robert L 1 · 4 0

A continent. I remember thinking it was a country. The reason is that we are only given a minute history of Africa, and only in terms of how we (white America) got our black slaves from there. So no, most don't realize that EVERY race you can think of migrated to and from there.

BUT generally speaking, it is the blacks of sub-Saharan Africa who are in desperate need of our attention and therefore get most of it.

2006-10-10 16:21:18 · answer #2 · answered by Anonymous · 2 0

I usually just saw white or black people in movies and stuff.

2006-10-10 17:55:37 · answer #3 · answered by Nicholais S 6 · 1 0

Some people are very ignorant; they will never know anything beyond their ignorance. A fool is ignorant.

2006-10-10 16:32:28 · answer #4 · answered by thorn 2 · 2 0
