
Thank You,

2007-05-18 05:59:24 · 5 answers · asked by Ms.Single 1 in Arts & Humanities History

5 answers

It wasn't!! It was the treaty that ended WWI and eventually paved the way for Hitler's rise to power and WWII!!

Ciao!!

2007-05-18 06:49:11 · answer #1 · answered by No one 7 · 0 0

The Treaty of Versailles was signed to end World War One.

But assuming you meant World War Two: the treaty set very heavy penalties for Germany to pay as reparations to the victors. It also severely limited German military build-up and arms production.

This created a huge strain on the German economy, which essentially collapsed. The resulting hardship created an environment that Adolf Hitler was able to exploit to gain power, ultimately leading to World War Two.

2007-05-18 09:34:15 · answer #2 · answered by rohak1212 7 · 0 0

The Treaty of Versailles is the peace treaty that ENDED WWI. Some people blame the severe penalties and reparations imposed on Germany in the treaty for leading to WWII.

Against the wishes of Woodrow Wilson, the European powers sought to severely punish Germany for WWI. The treaty made Germany give up large amounts of territory and pay monetary reparations to the victors. As a result of the treaty, Germany was plunged into a severe economic depression. After a decade of being broke and feeling put-upon, Germans fell under the spell of a man who promised to restore Germany to its status as a world leader; that man was Adolf Hitler.

2007-05-18 06:16:59 · answer #3 · answered by j76spirit 3 · 3 1

No. As it came at the end of the war, it could not be considered a cause.

2007-05-18 06:24:59 · answer #4 · answered by roadrunner426440 6 · 2 0

It can't. It came after WW1.

2007-05-18 06:05:27 · answer #5 · answered by brainstorm 7 · 5 0
