
2006-12-21 13:17:11 · 2 answers · asked by Christopher L 1 in Politics & Government Politics


Africa is a CONTINENT, not a country. There are many countries in Africa, and the United States probably has trade agreements with many of them. Decide which country you want to know about, then research it.

2006-12-21 13:20:39 · answer #1 · answered by Anonymous · 0 0

There is some irony here. The U.S. (and the West in general), through international institutions such as the IMF and World Bank, encourages African countries to lower trade barriers; otherwise, it becomes more difficult for those countries to obtain loans and aid. Western countries, however, insist on maintaining high trade barriers of their own on goods such as agricultural products. African producers can grow crops more cheaply than the U.S., but they cannot sell effectively in Western markets because of high tariffs. So the West uses its economic clout to force African countries to allow free trade on their end, but does not reciprocate. Trade with Africa is certainly not free, nor is it fair.

2006-12-21 21:44:29 · answer #2 · answered by Ape Ape Man 4 · 0 0
