
10 answers

What is happening in Darfur is truly horrific. Genocide has happened all over the world; we all know the story of Nazi Germany. Africa is not the dark continent. So much has been stolen from Africa that it really is shameful. In fact it is the richest continent on earth, with more oil, naturally forming diamonds, and minerals than the rest of the continents combined. The media just wants you to think that Africans are just over there running wild, killing and raping each other and spreading AIDS. That is not at all the whole truth. Africa is a great continent. Poor, but great nonetheless. People work hard there. Their children are smarter than American children, and when they come to America it shows how much they have been taught to value their education. I resent the media's portrayal of Africa as the land of dark savages. It's about time they gave Africa some positive coverage and some UN AID MONEY!!

2006-10-24 09:48:45 · answer #1 · answered by vanity planning 2 · 0 0

1. Bad news sells!!! - There is nothing dark about Africa. That was a misconception about the continent, and a picture that seems to stick since most of the people are 'black'. There are a lot of positive images; look for some and you will find them. No one should make you believe that the people in Africa are an unhappy, poor bunch!! There is good in the continent that people just choose to ignore.

2. Seeing is believing - go to Africa and see for yourself! If it were one of the saddest places, why would people go there for safaris, honeymoons, even childbirth (Pitt & Jolie), etc.?

3. Protectionist attitude - We always tend to believe that there is no place better than 'home'. Well, how would you know if you have not been outside your home?

2006-10-24 17:50:54 · answer #2 · answered by Mwajuma 2 · 0 0

I can't believe some of the stupid answers you're getting.

As for dolt #1
Before the various nations in Europe (Portugal, France, England, etc.) allowed their colonies in Africa independence, the African continent, as a whole, was thriving. The burgeoning African economy drew workers and investors from all over the world. The countries in Europe didn't build the African economy out of the goodness of their hearts; obviously their labor forces were also building a strong economic Europe. The work for the laborers was hard and the hours were long, but hunger and roving bands of murdering thugs were unheard of.

As for dolt #2.
After independence was granted to the individual African colonies, the indigenous tribes of each country vied for supremacy and control of the strong economy of that country. This struggle was not peaceful. It didn't take long for the warlords to wage bloody civil wars within their respective countries. As chaos ensued, their economies deteriorated as one despot after another murdered his way to the top spot.

Africa is, and has always been, very rich in natural resources. Oil, tin, manganese, copper, gold, diamonds, uranium, tobacco, and coffee are just a few of the resources I can recall off the top of my head. There are many others. Aid from the US is not what is needed in Africa (even though most countries in Africa have huge debts). What is needed is for the people of Africa, and the European leaders, to reinstate control in these countries. The US cannot be expected to be the world's police force, especially when it gets little or no help from others in the world. The gonadless UN should have been all over the mess in Africa years ago.

2006-10-24 10:34:05 · answer #3 · answered by abono11746 4 · 0 0

It's called 'sensationalism.' The media reports the most tragic and calamitous stories. Take a look at news stories about any continent and you'll see that the majority of them are about the bad things that happen.

2006-10-24 09:47:58 · answer #4 · answered by Mo the treehugger! 2 · 1 0

What exactly is your concern? Are you looking for research material, or to make a difference? My answer addresses both.

Stop reading. If you are that concerned, get your passport, airline ticket, and funding in order, and go see for yourself.

I am an American who has lived in West Africa and the Middle East for six years or so. When I visit the US, I can't believe what is portrayed on the news. It is so... one-sided...

Maybe I am wrong, but instead of asking a question here, get a ticket and go... see for yourself.

2006-10-24 10:10:16 · answer #5 · answered by Jennifer E 2 · 1 0

They seem to have been colonized and taken advantage of for their whole existence. It seems that the societies that branched off from Africa evolved more quickly, and now, because of their low technology and GDP, they cannot grow properly because they are so poverty-stricken.

2006-10-24 09:46:17 · answer #6 · answered by Ruffus Mcghee 2 · 1 1

Not really; it is just that tragedies and calamities make interesting reading and sell newspapers.

2006-10-24 18:32:22 · answer #7 · answered by Anonymous · 0 0

They get no US help because there's not much oil under Africa.

2006-10-24 09:46:53 · answer #8 · answered by The Indigo Cobra 4 · 0 2

Because the media only lets you see what they want you to see.

2006-10-24 20:14:24 · answer #9 · answered by cgroenewald_2000 4 · 0 0

Because the world is afraid of what is different.

2006-10-24 10:44:45 · answer #10 · answered by Dir33 4 · 0 0
