
It's not that I think our nation is truly imperialistic; I just couldn't think of a better word.

2007-07-04 09:40:31 · 12 answers · asked by PatriotKid 1 in Politics & Government Other - Politics & Government

12 answers

It changes to what suits us.

2007-07-04 10:10:47 · answer #1 · answered by alana 5 · 0 1

Our foreign policy changed from isolationism when we entered World War I in 1917, though imperialism is not an accurate description of our foreign policy. Our nation has not sought an empire, contrary to what the media would have us believe. There are few examples where we have fought specifically to obtain land and establish any sort of provincial government run by Americans, and none since we left isolationism behind.

2007-07-04 16:59:04 · answer #2 · answered by Bryan F 3 · 1 0

Other than pre-12/7/41 WWII policy, I believe isolationism has rarely applied. We have tried to use national defense as our standard policy.

Since 9/11, I think survivalism trumps imperialism as an applicable yet debatable word. Good question.

2007-07-04 16:57:31 · answer #3 · answered by Answernian 3 · 0 1

We're not an imperialistic power. The British Empire was imperialistic, Napoleonic France was imperialistic, Rome was imperialistic, the Third Reich was imperialistic. We're a democracy. We spread democracy where we can. If we were truly imperialistic, would Japan be an autonomous nation?

2007-07-04 17:21:50 · answer #4 · answered by arkainisofphoenix 3 · 1 0

Our economic imperialism was a byproduct of taking over the world and then giving it away at the end of World War II by demobilizing over 10 million men and women in uniform. They needed jobs, and there was no market for new tanks, battleships and bombers at the time, so the Commerce Department sent trade missions to the four corners of the world, and the rest is history.

Economic imperialism has come with a price in blood. Every time there is a war in any place where Coca-Cola, General Motors, or any other American brand is sold, we get talked into sending troops there eventually.

2007-07-04 16:51:21 · answer #5 · answered by a2zresource 1 · 1 1

Because we were mugged by reality in WWII. We abandoned the world after WWI and failed to help it get back on its feet after that destructive conflict. This led to the rise of factors that caused WWII.

Also, the world got smaller from a technological perspective. It has become easier to wage warfare on a global scale, whereas before we were isolated from the rest of the world. Now we are 30 minutes from nuclear destruction.

2007-07-04 16:48:47 · answer #6 · answered by The Stylish One 7 · 1 0

During WWI the US became a major player in international affairs. We are not imperialistic.

2007-07-04 16:45:13 · answer #7 · answered by regerugged 7 · 1 0

You know...Rome built an empire out of the idea of bringing civilisation to the barbarians...Pax Romana and all that.

Doesn't that sound like "spreading democracy where we can"?

2007-07-04 17:30:22 · answer #8 · answered by Morkarleth 2 · 1 1

Imperialism? I would have to disagree. We aren't forcing anyone to become democratic; we simply help those who are overrun by brutal dictators and whose governments are imminent threats to our security.

2007-07-04 16:46:14 · answer #9 · answered by AAA 3 · 1 1

Technically it was after the Spanish-American War... but we never really stood down after World War 2.

2007-07-04 17:01:15 · answer #10 · answered by planksheer 7 · 0 2

WWII, or to be more specific, Dec 7th, 1941.

2007-07-04 16:44:28 · answer #11 · answered by Anonymous · 0 1
