
2007-02-12 18:56:26 · 2 answers · asked by lirael1019 1 in Arts & Humanities History


Britain gained former German colonies in Africa.
France regained Alsace-Lorraine.
France and Britain gained control of the Middle East from Turkey.
They also gained war reparations under the Treaty of Versailles.

2007-02-12 20:27:38 · answer #1 · answered by brainstorm 7 · 0 0

They also gained German colonies in the Pacific, as well as hegemony over European affairs without German involvement. Germany was demilitarized (much like Japan today). More important, I think, was what Britain and France gained in an intangible sense: a determination never to let such a war break out in Europe again. That noble sentiment had negative consequences, leading to a period of appeasement of Nazi Germany that gave it time to rearm and prepare for war.

2007-02-13 10:22:18 · answer #2 · answered by corydon 2 · 0 0
