
I need to know so I can do my homework for school.

2007-05-09 11:49:17 · 5 answers · asked by qu24me 1 in Politics & Government Other - Politics & Government

5 answers

You don't give a time frame in your question, so I will assume your teacher is talking about the late 19th and early 20th centuries. After the Spanish-American War, the USA acquired Guam, Puerto Rico, and the Philippines and took effective control of Cuba, where we enforced our rule and quashed any resistance in an effort to improve the native cultures, whether they wanted us to or not.

Look up "banana republics": in the early 20th century, U.S. Marines spent a great deal of time in Central and South America protecting our sphere of influence and greatly disrupting local politics.

Also look at Fillmore: he sent Commodore Perry to Japan to negotiate trade and the opening of Japan's ports. When this at first didn't succeed, he sent Perry back with a modern fleet capable of blowing the Japanese capital to bits. This convinced the Japanese to have dealings with us, and it also got them started on the rapid modernization that would later bring them into even more conflict with us.

Anyone who says we weren't imperialistic has no understanding of history or vocabulary. Today US imperialism is of a more economic nature, very different from the past but seeking the same results.
Mac
historian

2007-05-09 12:59:19 · answer #1 · answered by Mac 3 · 1 0

While I can't actually agree that America (the USA) is or ever was an empire in the strictest sense, I can say that the "American way of life," meaning democracy, tolerance, and freedom, is something the USA wishes to spread to other countries.

Here's what Wikipedia says as an intro to the subject:

"American Empire is a term sometimes used to describe the historical expansionism and the current political, economic, and cultural influence of the United States on a global scale.

It is usually part of a politically charged debate which involves three basic questions:

Is the United States currently an empire?
If the United States is an empire, when did it become one?
If the United States is an empire, is that good or bad?
However, there are also more neutral uses of the term."

2007-05-09 19:01:13 · answer #2 · answered by Benjamin A 3 · 0 2

Sarpedon (above):

Ummm... have you ever actually run across the word imperialism before? We certainly are mighty imperialistic, and have been for quite some time. Please don't let me interrupt your fantasy, but we are also generally English-speaking, and we like to fly a flag with red, white, and blue colors... yet we're not Britain. Hmmm... see, there is a tie-in there.

In other words: how silly do you have to be to deny American imperialism...

In answer to the originally stated question: America is imperialistic, although evidently the first answerer isn't.

2007-05-09 19:08:28 · answer #3 · answered by Blackacre 7 · 0 1

725+ military bases in 130+ foreign countries. If that is not an empire, I have no idea what other word could be used.

To find out the reasons, you would have to research the military-industrial complex; that's the beginning of all empires.

2007-05-09 19:27:43 · answer #4 · answered by Jose R 6 · 1 1

America is not an imperialistic nation. Have your teacher look up the word imperialism.

2007-05-09 18:59:43 · answer #5 · answered by sarpedons 3 · 0 5
