
If the United States is an empire, when did it become one?
If the United States is an empire, is that good or bad?

2007-06-04 18:23:13 · 5 answers · asked by Anonymous in News & Events Current Events

5 answers

No. -But McDonald's is! :)

2007-06-04 18:28:09 · answer #1 · answered by Joseph, II 7 · 1 0

Empire: a group of countries under a single authority

Not sure the U.S. qualifies under that definition. Though we influence and hold partial sway over some governments, we hardly rule them.

2007-06-05 11:40:39 · answer #2 · answered by Bob Mc 6 · 0 0

Yes it is. Americans like to believe the propaganda that they're only omnipresent because they've "been invited in" by other nations, or that they're "liberating" people and "spreading democracy", but what their gov't is really doing is putting its fingers in all the pies. It's an economic empire that really took off following WWII.

2007-06-05 01:32:55 · answer #3 · answered by Anonymous · 1 3

Yes it is. It's an evil empire.

2007-06-05 01:54:46 · answer #4 · answered by Anonymous · 0 2

no, but try Walmart! he he he....

2007-06-05 01:30:50 · answer #5 · answered by Krytox1a 6 · 1 0
