
This is for an assignment. Please be appropriate.

2007-04-22 14:26:14 · 12 answers · asked by Gymnast 1 in Arts & Humanities History

12 answers

Actually, this is an interesting historical footnote. While not an invasion, German forces did intrude into US territorial waters.
During the war, German subs patrolled very close to the US Atlantic and Southern coasts, looking for merchant traffic. They were sighted by coast patrols, but the stories were kept out of the papers to avoid panic.
German records recovered after the war confirmed the shallow-water sub visits.

2007-04-22 16:45:36 · answer #1 · answered by adphllps 5 · 0 0

No. Hitler invaded Poland in 1939. The Germans never invaded the U.S. We went over there to fight World War 2.

2007-04-22 21:33:05 · answer #2 · answered by Flash1957 3 · 0 0

If Hitler had shown up in North America, we would have kicked his hairy chumpkas out to sea, and that would have been appropriate. He was never here. He invaded Poland, then Britain thought, okay, enough. Then Canada thought, well, that's enough too. The war was on. Poland fought like the dickens for their cities. Then Herr Hitler thought he was going to take over Europe. Well, that wasn't a good thought. Then Japan decided to bomb Pearl Harbour. Huge mistake. Then we all got on board and took out Hitler, brick by brick.

2007-04-22 21:51:53 · answer #3 · answered by Kilty 5 · 0 1

Unless you mean emotionally. Physically, no. Hitler's armies went into Poland in 1939.

2007-04-22 22:09:30 · answer #4 · answered by ernie 2 · 0 0

No, he didn't. He invaded Poland, and pretty much all of Europe, but the world got involved, hence the name World War 2. Hitler's troops never got as far as the US, but they might have done if England and America hadn't fought them off.

Perhaps you are getting this confused with the story that the United States almost had German as its official language, with English only winning the vote very narrowly.

http://www.jacksblog.co.uk - Free daily music, weekly freebies, now with daily sudoku!

2007-04-22 21:37:32 · answer #5 · answered by Jack Creighton 2 · 1 0

Nope. Hostile foreign armies have not invaded the continental United States in force since the War of 1812. He invaded multiple countries in 1939, but the only one that comes to my mind is Poland.

2007-04-22 21:34:30 · answer #6 · answered by John 3 · 0 2

No, many Americans supported Hitler in 1939.

2007-04-23 01:23:30 · answer #7 · answered by brainstorm 7 · 0 0

No. He started WWII and invaded Poland.

2007-04-22 21:32:17 · answer #8 · answered by scorpion43_db 3 · 0 0

No. Germany declared war on the US after the Pearl Harbour bombing, honouring their alliance with Japan. Germany did send some agents into the States after this to disrupt production and destroy factories, but they didn't get much done.

The Atlantic Ocean, however, was a killing ground for any who dared to enter.

2007-04-22 22:21:42 · answer #9 · answered by Seamus S 3 · 1 0

No, it was Poland that Germany invaded.

2007-04-23 03:50:37 · answer #10 · answered by buster5748 3 · 0 0
