Seek truth!
This was the real pretext for World War I. Balkan unrest and the British invasion of Germany's colonies led to World War I, so it was the British who instigated the war, not Germany.
2006-09-01 14:10:25
·
answer #1
·
answered by Calvin of China, PhD 6
·
0⤊
0⤋
If this is homework, you'd best crack the books a little harder.
Germany didn't invade Africa in WWI. The French and British did. Germany had a few African colonies and the British and French thought it would be a good idea to keep the resources in those colonies and the Germans apart. Since the colonies weren't very well defended, it wasn't much of an invasion. The British fleet could pretty much get the job done on its own.
If you mean WWII, that's an entirely different matter, but you didn't ask. As to WWII, think Egypt and the Suez Canal. Think Morocco and Algeria and the Straits of Gibraltar. Think Tunisia, Libya, and Arabia and oil. Think 11/11/18 and paybacks.
There's all kinds of neat stuff out there. You ought to try reading some of it.
2006-08-25 05:57:18
·
answer #2
·
answered by Oscar Himpflewitz 7
·
4⤊
0⤋
They didn't - you are getting confused with WW2.
In WW1 they already had colonies in East and West Africa, which played only a relatively minor part in the war and were more of a nuisance to the Allies than a serious threat.
After the war Germany lost all of its African colonies to the British and French under the Treaty of Versailles.
2006-08-26 01:47:49
·
answer #3
·
answered by brainstorm 7
·
0⤊
0⤋
To weaken the British troops in Egypt and neutralise the North African flank of the Allies.
2006-08-25 04:39:57
·
answer #4
·
answered by ? 2
·
0⤊
0⤋
The World War II answer is that Hitler didn't want to be in Africa at all; he only intervened to shore up the Italians, who were unable to handle the British.
2006-08-25 14:12:57
·
answer #5
·
answered by Will B 3
·
0⤊
0⤋
They didn't invade Africa; they already had colonies there and defended them.
2006-08-29 01:12:48
·
answer #6
·
answered by Ed M 4
·
0⤊
0⤋