
If so, is "Imperium Americanum" a good thing or a bad thing?

2007-01-24 18:21:49 · 12 answers · asked by Anonymous in Politics & Government Other - Politics & Government

Thank you all for the interesting responses so far. I've been reading various articles on this subject and wanted to know what everyone's thoughts were.

Respondents almost unanimously believe that in order for there to be an empire, there must be conquest. That's a prerequisite I hadn't thought of.

Thank you again.

2007-01-24 18:47:52 · update #1

12 answers

Hmm. People certainly forget that the United States acquired all of the land on which it is currently built by ... TAKING IT: paying the Native Americans essentially nothing for it and negotiating with Mexico for territory under threat of force. The leftovers of this mass land-taking (which was only very recently halted by the U.N.) include Puerto Rico, Guam, and the U.S. Virgin Islands, which are called 'insular areas' and are still provisionally controlled by the U.S. government (http://en.wikipedia.org/wiki/Insular_area). The U.S. still has its claws in them. And any nation that is still colonizing others, whether it takes over the other country's government or not, is still acting in an imperialist manner. Even where we do not colonize other countries' governments (or, in some cases, their land), we colonize their minds.

Our capitalist system has no doubt made other countries colonies of our business and consumer culture and of our language, all of which have essentially replaced or at least marginalized other cultures and native languages. More people speak English as a second language than as a first language! English is one of India's official languages, and there were certainly no English speakers to speak of before Britain colonized India to make money on its 'exotic' products and resources. This would be analogous to the U.S. making Spanish one of its official languages. An imperialist viewpoint helps explain why this will never happen: Spanish is not considered as "important" as English, and it is not the U.S.'s "original" language. Yet English was certainly not India's original language, and it was nonetheless embraced wholeheartedly. To say that English is somehow "more important" than another language for whatever reason (probably money) is, ipso facto, to adopt an imperialist attitude.

So, yes. The U.S. is an empire - at least culturally and economically.
________________

There is always a conquest, except it's not for land anymore. It's for money.

2007-01-24 19:09:51 · answer #1 · answered by Anonymous · 0 1

The United States is being used for that purpose. After the collapse of the Soviet Union, the NeoCons decided this was an excellent opportunity to use the power of this country to seize the moment created by the vacuum the Soviets left behind. Their ambition was to control the Persian Gulf while they could, a sort of shoot-first-and-ask-questions-later foreign policy. They believe in doing unto others before they do unto you, so they decided to invade Iraq and Afghanistan. The empire they support is more economic than political, and you can have hegemony without making the conquered state a colony. The results have been a disaster for the US (internal division and massive debt, http://www.uuforum.org/deficit.htm) and catastrophic for the Middle East, threatening to make the whole region go up in flames.

2007-01-24 18:38:18 · answer #2 · answered by michaelsan 6 · 0 1

America as Empire? I believe there is reason to think so; it seems we have picked up the pieces of the British Empire as protectors of the resources that developed nations require to survive. Much of the Middle East (including Iraq) was administered by the British following the fall of the Ottoman Empire in WWI. Owing to their devastation in WWII, the Brits have opted for a lesser role, and their offspring, the US, has filled their shoes. (Israel is propped up to protect our interests in the area, and we now occupy Iraq for similar reasons.) I also encourage everyone to read up on our occupations of and incursions into Mexico, the Philippines, etc.; the list goes on.

2007-01-25 03:25:44 · answer #3 · answered by coderednation2007 2 · 1 0

No, but I consider Western Democracy to be an "empire" of sorts. It's a good thing.

An "empire" is not automatically bad or good, it just is what it is. The Communist world was an empire (mostly the Soiviet Empire). It was, as Reagan said, an "evil empire." Nearly 100 million people were killed for Communism in the 20th century.

Love Jack

2007-01-24 20:48:55 · answer #4 · answered by Jack 5 · 0 1

The United States is not an "empire". We don't have an emperor; we have an elected government. The constant rotation of our nation's leaders gives us fresh ideas in administration and allows us to get rid of bad policies by having the people remove the unwanted leaders.

Empires normally invade countries and keep them. We invade only in self-defense and we always give them back. If we didn't, we'd be ruling most of Asia and Europe today.

Well, after all, we did conquer Italy, France, Belgium, Luxembourg, Germany, Japan, the Philippines, most of the south Pacific Islands, etc.

What kind of an idiot emperor goes to the trouble to conquer countries and then allow them to have their land back with free, prospering governments?

2007-01-24 18:30:37 · answer #5 · answered by Anonymous · 0 2

No, is it the Ponderosa with Ben Cartwright and his 3 sons in charge? Duh? No way, it is not that at all. It is not an Empire either, but it could become one if we allow all these governing bodies to pretend they are senators of Rome. I think we need to be more down to earth, like Abe Lincoln was, and that is a fact. We need a poor man in office, and not because of color either; if we elect anyone, it should be for moral fiber and not for religion, or because he has a family in politics or a parent or cousin that was a president. It is far too easy to whip us around like jello. We need someone with strong character, someone who is an American and not just out to be one, and who is for the People.

2007-01-24 18:30:17 · answer #6 · answered by Anonymous · 0 2

The USA is about as strong as a country could possibly be. After elections, you do not see rioting, you do not see coups. Dems may not like Reps and vice versa, but in the end, we all respect the will of the people and the power of the election process.

2016-12-03 00:43:34 · answer #7 · answered by bartow 4 · 0 0

No. Any country recognized under the United Nations must adhere to international law, which explicitly forbids seizing new territories. An empire, by definition, looks to increase its own size and take more people under its rule.

2007-01-24 18:34:05 · answer #8 · answered by Anonymous · 0 2

Being an "EMPIRE" are we supposed to STRIKE BACK....?? I just had to say that....

No, we are not an empire but a nation that has its strengths and downfalls. Our government sees fit to send troops to "help" other countries that are crying out for help. Then when we do, we are ridiculed to the point where we are rejected as a "Super Power". We are still human and all make mistakes, BUT I am still PROUD of my country.

Southern by birth
American by the grace of GOD

2007-01-24 18:48:12 · answer #9 · answered by Mary D 4 · 0 2

NO
Empires were created by "acquiring" other countries/territory and adding them to their own nation....
The US is a large country, but it has never decided to retain any conquered nation ...
NO empire

2007-01-24 18:29:15 · answer #10 · answered by SURECY 3 · 3 1
