Briefly: Nazi Germany began invading other nations in Europe, and signed a pact with Italy and Japan promising to stay out of each other's way in their efforts at conquest. One by one, nations fell to the Nazis while the other powers did nothing to stop them, until Germany invaded Poland in 1939, the remaining nations decided enough was enough, and World War II began.
The U.S., officially neutral, joined the war two years later after Pearl Harbor was bombed by the Japanese.
2006-12-04 07:47:54 · answer #1 · answered by plasmasphinx
Zach is basically correct.
Germany had been making aggressive moves in Europe for several years, and finally launched an all-out invasion of Poland in September 1939.
Japan had been at war in China for almost a decade, and the conflict escalated with the attacks on Hawaii and the Philippines in 1941.
So, German expansion and Japanese militarism.
2006-12-04 07:49:35 · answer #2 · answered by parrotjohn2001
Germany invaded Poland in 1939. In Asia, Japan had invaded China years before.
2006-12-04 07:47:17 · answer #3 · answered by Dr. NG
Pearl Harbor, I'm pretty sure. We were involved in WW2 but didn't send massive numbers of troops over to Europe until after Pearl Harbor, since Japan was Hitler's ally.
2006-12-04 07:50:36 · answer #4 · answered by zozosammy