
2 answers

Within America, the Great Depression aided Hitler's rise as the United States retreated into isolationism. Preoccupied with its own economic troubles and not wanting to be affected by outside events, America took little interest in what was happening across the pond in Europe. Because WWI had weakened so much of Europe, there was no leading power there, and America had come out of the first war positioned to be the central power. This led to an environment where Hitler was largely able to do as he pleased, unchecked by external powers. Hitler was also respected at the time by many Americans because, as mentioned in the first answer, the Germans were in a massive economic crisis and Hitler was seen as leading his country out of it. Americans saw that economic growth and wanted similar growth at home.

2007-12-05 09:31:21 · answer #1 · answered by bubblybat 4 · 0 0

I have no idea...other than that the Treaty of Versailles had exacted such punitive reparations on Germany after WWI that the Germans were experiencing a Great Depression along with the rest of the Western world. They were humiliated and were "ripe" for Hitler's megalomaniacal rantings about the super race. He poured Germany's resources into the war machine and glorified the soldiers fighting for the fatherland. This fed into their invasions of "inferior" countries, making Germany richer by plundering country after country. The Germans were enriched in money and pride, and they worshiped Hitler--especially the young, whom he took over in the youth groups and brainwashed.

2007-12-05 16:58:29 · answer #2 · answered by Martell 7 · 0 0
