In some ways, the roots of WWII lay in the economic ruin of Europe after WWI. Much of the continent was devastated. Additionally, the Treaty of Versailles sought to punish Germany for starting WWI by limiting its military, requiring financial reparations to the countries it had attacked, and barring it from joining the League of Nations. This prevented Germany from taking part in the rebuilding process and essentially made it a pariah in Europe. The economic and political chaos in Germany arguably gave rise to the success of the Nazi Party, which promised law and order and told the German people that they should be proud to be Germans (contrary to the previous message that they should be ashamed because they had started WWI). Once the Nazis came to power, they simply disregarded the limits on rebuilding the military, secretly building and stockpiling weapons until they had the firepower to overrun their neighbors and basically run amok across most of Europe in WWII.
2007-05-20 05:36:17 · answer #1 · answered by Kathy J 1 · 0⤊ 0⤋
Many people believe that WWI and WWII are really the same war. There was just a very brief period of slight stability and quiet between the two.
Basically, WWI pissed the Germans off so much that they felt the need to 'get back' at the people who had screwed them over (namely the West). And they did.
2007-05-20 15:11:52 · answer #2 · answered by xsneaker_pimpsx 3 · 2⤊ 0⤋
It could be stated that the war reparations Germany was required to pay after WWI were a contributing factor toward WWII. The amount of reparations required was 132 billion gold marks. Germany lost assets, mining regions, land, etc. as a result of WWI. Its economy was damaged, the mark was devalued greatly, and the ability to pay the reparations wasn't there. Tensions rose among the lower middle class, and Hitler's message hit home with this group at the time. However, the US did make loans to Germany for rebuilding. Do some research on European economic history post-WWI; it might help. Search the following: the Dawes Plan, the Young Plan, and the Treaty of Versailles.
It's a start.
2007-05-20 12:35:12 · answer #3 · answered by lorem_ipsum 3 · 0⤊ 0⤋
Research the FACT that the defeat of Germany in WWI and the harsh terms of the Versailles Treaty left the Germans broken and bitter. This paved the way for Hitler to use his rhetoric to come to power. If this paper is due on Tuesday, you'd better get rolling.
2007-05-20 12:25:14 · answer #4 · answered by Anonymous · 0⤊ 0⤋
Well, the "peace" accords after WWI (the League of Nations) disarmed many of the Allies. When Hitler came to power, he used this (and everyone's overriding desire for "peace") to build his military up to very strong levels, and when he attacked, there was really nothing the European countries could do about it.
They were also still weary from WWI, so they wanted to avoid war at all costs. When you try that hard to avoid war, war is inevitable.
2007-05-20 12:23:46 · answer #5 · answered by bigtalltom 6 · 0⤊ 0⤋
If there had never been a WWI, there could never have been a WWII.
2007-05-20 12:22:10 · answer #6 · answered by icemunchies 6 · 0⤊ 0⤋
No, because it had only the most tenuous connection to World War II. Both wars were independent events that began for very different reasons.
2007-05-20 14:03:57 · answer #7 · answered by Randy 7 · 0⤊ 1⤋