No. All German colonies were removed from German control following WWI.
Togo, Kamerun, German East Africa, and German Southwest Africa were handed out as League of Nations mandates: Togo and Kamerun were split between Great Britain and France, German East Africa went chiefly to Great Britain (with Ruanda-Urundi to Belgium), and German Southwest Africa to the Union of South Africa.
The Pacific possessions were divided as well: the Marshall, Caroline, and Mariana Islands went to Japan, German New Guinea to Australia, and German Samoa to New Zealand.
Kiautschou on the Chinese mainland (the German equivalent of Hong Kong and Macau) was awarded to Japan at Versailles and returned to China in 1922.
2006-12-22 01:55:41 · answer #1 · answered by dollhaus 7 · 2⤊ 0⤋
One could perhaps say that Germany treated Poland and some other areas in the east like colonies during the war, but it had no overseas colonies in the usual sense after 1918.
2006-12-22 09:53:09 · answer #2 · answered by mai-ling 5 · 0⤊ 1⤋
Not during WWII, but they did until the end of World War I.
After WWI, Germany was confined to its own territory under the Treaty of Versailles, with many restrictions such as a token navy and no air force. Hitler defied the treaty and rebuilt those armed forces.
So, no. They did not have colonies.
2006-12-22 09:46:24 · answer #3 · answered by Anonymous · 0⤊ 0⤋
This question is in the wrong forum; however, the answer is no.
MERRY CHRISTMAS and have a nice day.
Thank you very much, while you're up.
2006-12-22 09:42:09 · answer #4 · answered by producer_vortex 6 · 0⤊ 1⤋