WWI is significant mainly for setting the stage for WWII, and that comes down to how badly the victors screwed Germany over. WWI was fought in a terrible style, trench warfare: a horrific stalemate held on the Western Front, in France, for years. After America stepped in (as we always have, and probably will again) for the French, how do you think France wanted to treat Germany, which had been destroying its countryside, tainting its air with poison gas, and killing husbands, fathers, sons, and brothers?
You guessed it: they put Germany in the ground, big time. Look up the Treaty of Versailles and you'll almost feel bad for pre-Nazi Germany. Yearly reparations went to France and Britain, and after a while Germany was in terrible shape. There were no jobs, people were deep in debt, and guess whom every German hated: France and Britain, who were squeezing them with those reparations, and America, which had helped defeat them.
Well, that's how WWI affected Europe. If you think that's bad, it hit hard right here in the U.S. too. You see, when soldiers returned from the trenches, they wanted to PARTY. New fashions and new technologies swept America, and everyone wanted them. Some "genius" came up with the idea of buying on credit: buy now, pay later. It's a shame it was such a huge hit, because people bought now and couldn't pay later. A decade on, in 1929, the stock market crashed and America sank into the worst economic state in its history. The Great Depression had begun.
I've got to get back to my own homework, but I hope this has helped you. =)
answer #1 · answered by Anonymous · 2007-03-12 11:45:32
It would take an awfully long time to go into everything, but basically you've answered your own question by saying that it started WW2.
The terms of the German surrender were such that the Allies demanded such huge reparations that the German economy nosedived, creating ideal conditions for someone like Adolf Hitler to gain power. In his early days as Chancellor he did a lot of economic good for the country and had a lot of people behind him, until they realised what a despot he really was.
In England, WW1 was also the beginning of the end for the class system: so many men from all walks of life were thrown together in the trenches and never came home that the aristocracy were left without many of their servants and had to learn to do things for themselves.
answer #2 · answered by bilbotheman 4 · 2007-03-12 11:52:21
It eventually caused WWII, LOL! That's funny. WWI had a big impact in the sense that several monarchies fell as a result of it (the German, Austro-Hungarian, Ottoman and Russian). Germany, the Ottoman Empire and their allies had to give up territory. Under the Treaty of Versailles, Germany was only allowed a tiny army (when Hitler took over he started rebuilding it and pissed a lot of people off). By getting involved, the U.S. changed from being an isolationist state to being an interventionist state (because it got involved in a war that technically had nothing to do with it). Russia pulled out of the war before it ended and became communist (one of the largest changes).
answer #3 · answered by Vince R 5 · 2007-03-12 11:47:40