
for some project at school

2006-12-12 06:22:25 · 8 answers · asked by Hannah! 1 in Arts & Humanities History

8 answers

This is a topic I did some serious research on when I was working on my bachelor's degree in history. Short answer: he affected us a lot, in ways almost nobody is consciously aware of.

I think the greatest harm he did to America is that he shaped how we think of race. For example, who reading this thinks that Jews are a race? You think that because Hitler's propaganda minister, Joseph Goebbels, made it up. They aren't. Judaism is a religion.

He also managed to demonize science. Since all the evil he did was supposedly backed by modern science, he came to represent the fruition of the secular world. In reality, he was a Christian, and a very religious/superstitious one at that. But now people think of science as bordering on evil because of him.

Otherwise, I think that America's effect on Nazi thinking was more profound than vice versa. For example, early in Hitler's rule, he and his officers frequently praised America for its "eugenic" policies. They just took it one step further.

Since this is an area that I have researched extensively, and am very interested in, I'll give you all the info you need. Email me. I'll give you my email address in private. I'll check my email again tonight.

Great question, by the way.

Peter S

2006-12-12 06:35:44 · answer #1 · answered by Peter S 2 · 0 0

Hitler had no effect on America, other than to give the U.S. an opportunity to aid the European countries his mindless rampage had oppressed and enslaved.

His desire for an Aryan world order gives present-day Americans a reason to be more cognizant of our open policies of freedom, and of how not paying attention opens the door for abuse and potential domination of the weak in society. The last time I looked, there weren't that many blue-eyed blonds anyway!

2006-12-12 06:31:57 · answer #2 · answered by Anonymous · 0 0

*Immigration increased indirectly as a result of WWII. Not only did the threats Hitler posed to certain cultures drive them from their homes and nations, but a booming US economy became more and more attractive to many Europeans after WWII.

*Economy (sort of mentioned above). Without WWII, the United States would have had a much tougher road back from the Great Depression. WWII was certainly a financial godsend for the US economy, and without the ideals of Adolf Hitler, the war doesn't happen.

This should help you with your project.

2006-12-12 06:35:59 · answer #3 · answered by Jape Coyote 2 · 0 0

In a small way, Hitler "affected" America through his own beliefs.

There are a lot of (if often hidden) white supremacist groups in the U.S.

2006-12-12 06:29:54 · answer #4 · answered by Ambassador Z 4 · 0 0

In the usual, expectable ways associated with enduring a world war. Plus, it bankrolled the Bush crime family, war profiteers from that period up through the present. Anyone who has not been lobotomized through one means or another realizes the negative impact that "dynasty" has had on the planet, eh? ;-)

2006-12-12 06:32:26 · answer #5 · answered by drakke1 6 · 0 0

"Affect." He didn't, Hannah. He gave a lot of people a reason to do something people wanted: start a war and profit from it. That was all.

2006-12-12 06:25:53 · answer #6 · answered by vanamont7 7 · 0 0

Well, his nephew lived in the US, so he "affected" America by proxy.

2006-12-12 06:32:20 · answer #7 · answered by carltuesday 2 · 0 0

You need to be a bit more specific.

2006-12-12 06:28:26 · answer #8 · answered by NAQ 5 · 0 0
