You've already got some good answers below. Now, read a fresh perspective on the BENEFICIAL side of colonialism, written by Dinesh D'Souza.
If you read this article, you will know more than anyone else in your class.
http://chronicle.com/free/v48/i35/35b00701.htm
2006-06-24 10:44:59
·
answer #1
·
answered by pachl@sbcglobal.net 7
·
0⤊
0⤋
European imperialism was going on during the 1800s and early 1900s. World War 2 was really the end of most European imperialism, and between the world wars, the British and French drew most of the borders for countries all over the world.
2006-06-24 09:51:30
·
answer #2
·
answered by AJaxx 1
·
0⤊
0⤋
Imperialism has been occurring since the 1500s, when the Spanish and Portuguese came in and started colonizing the Americas/Asia. By 1900, most of the land was under some sort of European control (except the American continents). After WWII, most of the colonies gained their independence between 1945 and 1990.
2006-06-24 09:52:12
·
answer #3
·
answered by Dave A 2
·
0⤊
0⤋
European imperialism is STILL occurring. In fact, France, that bastion of peace and understanding, engaged in numerous armed conflicts in Africa during the last century to preserve its colonial power on the continent, and continues to be a colonial force around the world today. So, European imperialism is still alive and well!
2006-06-24 13:06:19
·
answer #4
·
answered by A Guy 3
·
0⤊
0⤋
when did imperialism in Europe occur? (during world war 2)?
Because they needed money, oil, and people to fight for them.
2006-06-24 10:04:39
·
answer #5
·
answered by Anonymous
·
0⤊
0⤋
Wasn't going to respond to this as it seemed a stupid and bizarre question, but then I saw the "answer" from "only the truth".
What complete and utter rubbish!!
I don't know where you come from, but the schooling system has failed!!
2006-06-24 12:17:25
·
answer #6
·
answered by Anonymous
·
0⤊
0⤋