
4 answers

Before WWII the USA was isolationist, meaning America didn't join the League of Nations or enter any alliances. It was mainly concerned only with North and South America and Asia, since we had the Philippines. In summary, more passive.

After WWII the USA took an active role in world affairs. America joined the United Nations, kept a large military presence abroad (mainly in Japan and Germany), formed military alliances such as NATO, and created programs like the Marshall Plan to aid Western European economic recovery.

2007-03-24 18:20:48 · answer #1 · answered by Anonymous · 0 0

The US became isolationist well before World War I and attempted to return to that posture after that war.

After World War II, it had no choice but to assume a more involved posture with the rest of the world. Of course, only an idiot would classify this as "imperialistic."

2007-03-25 01:06:59 · answer #2 · answered by TheOnlyBeldin 7 · 0 1

Post WWI: Policy of Isolationism

Post WWII: Policy of Engagement

:)

2007-03-25 04:19:50 · answer #3 · answered by bryan 2 · 1 0

Post-WWI, the US isolated itself from the rest of the world. It didn't bother with, worry about, or care about the rest of the international community.

Post-WWII, the US became extremely imperialistic, as it remains now.

2007-03-25 01:05:09 · answer #4 · answered by Anonymous · 0 2
