
2007-11-15 11:08:54 · 11 answers · asked by Tiara J 1 in Arts & Humanities History

11 answers

No. WWII began in 1939 when Hitler invaded Poland on September 1st; under the Hitler/Stalin Pact, Stalin invaded from the east a couple of weeks later and the two met in the middle. Great Britain then honored her treaty with Poland, declared war, and WWII began.

On December 7th, 1941 Japan attacked Pearl Harbor, then on December 11th, 1941 Hitler declared war on the United States, which dragged an isolationist U.S. into the fray.

2007-11-15 11:19:54 · answer #1 · answered by Anonymous · 1 0

No. The war started before the attack on Pearl Harbor. Pearl Harbor started the U.S. involvement in World War 2.

2007-11-15 11:42:48 · answer #2 · answered by Anonymous · 0 0

Mark is right: it happened after the start of the war. After Pearl Harbor, though, America entered the war.

2007-11-15 11:17:01 · answer #3 · answered by Caleb G 1 · 1 0

The war started in 1939, as mentioned previously. Pearl Harbour brought the US into the war, but only with Japan.

The US did not actually move into the European theatre of war until well after Pearl Harbour. Initially the US was providing aid to Britain but was not fighting against Germany.

There is well-documented evidence that the US thought Britain was doomed and that Germany would overrun her. The US was trying to get Canada to agree that the British naval fleet be handed over to the US in the event Germany did overrun the UK (Canada had an agreement that, should Britain fall, the navy would be handed over to her).

The main worry for the US was that if Britain were to fall and Germany took control of that navy, America would be in danger from the might of the German fleet (the British, German, French and Italian navies combined).

After a lot of politics, America finally agreed to send troops to France. Once there, a lot of soldiers were shocked: they were "green" and came up against British, French and German soldiers who had fought for years and were battle-hardened. But if America had not sent troops over to reinforce the Allied soldiers, Europe would have fallen to Hitler.

It does have to be said here that Russia DID actually help the Allies by fighting from the east, as the Ostfront (Eastern Front) was where a great many German soldiers perished under Russian pressure.

2007-11-17 04:09:54 · answer #4 · answered by Guy M 3 · 0 0

LOL, World War Two was going on WAY before Pearl Harbour. What are they teaching in American schools? No, seriously, hollyy and Michael: actually ALL of the British Commonwealth was fighting in WWII LONG before the US entered. That includes your neighbour to the north, Canada, which was involved from the VERY beginning. We're not Europeans; we're from North America. Therefore, it was not just Europeans and Asians who were fighting (Russia is considered BOTH European and Asian, depending on where you are in that country).

2016-05-23 08:06:17 · answer #5 · answered by diann 3 · 0 0

Nope, all it did was bring the USA into the Pacific Theater of Operations, or PTO. Germany declaring war on the US brought the US into the ETO. Both had been going on for years: the PTO (counting Japan's war in China) since the early 1930s, and the ETO since 1939.

2007-11-19 02:58:13 · answer #6 · answered by rz1971 6 · 0 0

No--WWII had been going on for years before the Pearl Harbor attack.

2007-11-15 11:14:02 · answer #7 · answered by Mark 6 · 1 0

No, it finally pushed America into a war with Japan. Then Adolf Hitler, in his stupidity, decided to declare war on America a few days later.

2007-11-15 13:59:15 · answer #8 · answered by Mr. nixie 3 · 0 0

No, but it did get the US involved in the war. We were trying to stay neutral. The Germans invading Poland in 1939 started it.

2007-11-15 11:18:31 · answer #9 · answered by Frosty 7 · 1 0

It's what got America into the war fully.

2007-11-15 11:16:38 · answer #10 · answered by atdiw 3 · 0 0
