There is no U.S. imperialism, never has been, and never will be. Maybe you should look up what imperialism is and learn that when you twist the meaning of a word, the words you use lose their value.
answer #1 · answered by DeSaxe 6 · 2007-03-01 02:10:55 · 0⤊ 1⤋
We were more subtle! The European nations went out and carved up huge chunks of Africa and North and South America, taking what they could while they could. Remember that the sun never set on the British Empire! The US, despite what dickn2000 said, did have its own imperialism: how did we acquire Texas, New Mexico, and Arizona? We are still in Cuba, and we created the nation of Panama out of Colombia when we took over building the canal. We also took the Philippines, Puerto Rico, and Guam after the Spanish-American War. And we acquired a lot of our territory through good old American money: we bought Alaska, and the Louisiana Purchase doubled the size of the country.
Wikipedia defines imperialism as "the policy of extending a nation's authority by territorial acquisition or by the establishment of economic and political hegemony over other nations. This is either through direct territorial conquest or settlement, or through indirect methods of influencing or controlling the politics and/or economy. The term is used to describe the policy of a nation's dominance over distant lands, regardless of whether the subjugated nation considers itself part of the empire."
answer #2 · answered by jsb125 1 · 2007-03-01 02:57:33 · 0⤊ 1⤋
Expansion and imperialism are thought to be the way of any growing nation. From Suleyman the Magnificent to Napoleon, expansion was a way of ensuring the continuation of the empire. I believe this question is a specific reference to European imperialism during the grand colonial epoch, oui? Typically, a country would "take charge" of some overseas territory and exploit it as the mercantilists wanted. It was a way of getting richer, expanding the nation and its culture, converting "lost souls", and later shouldering the "tedious" burden of the White Man. At that time, the US was either a New England colony or a newly independent state with its own priorities and a Wild West to conquer, while European imperialism flourished and conquered the globe. After the US, Latin America almost single-handedly got rid of the restraints of colonialism. But imperialism continued to flourish: Africa was no longer just a few colonies and shipping points for slaves; it was divided up more and more until the whole surface was saturated (in part thanks to the railways and the telegraph). Then came the 20th century, and European imperialism in this "classic" form declined, though it continued to be the envy of the Italians and the Germans, who wanted their share.
DeSaxe and dickn2000 seem to forget the Spanish-American War of 1898, which gave the US not only control over Cuba's independence (and in practice over Cuba itself until 1959 :P), but also full control of Guam, Puerto Rico, and the Philippines. For a few years these were American colonies in the same sense as those of traditional European imperialism, and this tied in completely with the American spirit of the time: their "frontier" to the West had been fully conquered, and the next frontier would be those countries.
Mainly, though, expansion in this "direct" form of imperialism didn't last long after the turn of the century. What took its place, neo-colonialism, is a more indirect and subtle form of imperialism that characterises the American way. Don't expect, though, that the Europeans haven't also taken up this form of neo-colonialism. Still, I believe European neo-colonialism is more benign than the American kind, simply because they have fewer exploiting enterprises than the Americans.
answer #3 · answered by Anonymous · 2007-03-01 02:53:59 · 0⤊ 1⤋
WHAT U.S. imperialism?? I challenge you to name a single incident of U.S. imperialism. I, on the other hand, can cite time after time when the U.S. had the chance to become imperialist and chose not to: Mexico, Cuba, Grenada, Panama, Germany, to name a few. The U.S. conquered these countries and had the chance to make them U.S. territories. THEY DIDN'T!! You need to learn what the term "imperialism" means before you spout off about something of which you have absolutely no knowledge!!
answer #4 · answered by Anonymous · 2007-03-01 02:18:07 · 1⤊ 1⤋
It was no different. The US declared war on Spain with much the same colonial ambitions as the western European powers, and like them it used a lot of rhetoric to try to legitimise its actions. Why do Americans conveniently gloss over the Spanish-American War? Are they ashamed that it exposes the US as an imperial power like all the rest?
answer #5 · answered by Anonymous · 2007-03-01 02:37:36 · 0⤊ 1⤋
Rather than ruling directly as the European powers did, the US finds indigenous lackeys to do it for them. Subtler and more effective, as it provides the pretense of home rule, and disgruntled locals blame their local leaders first.
P.S. Your use of "did" implies it ended at some point.
answer #6 · answered by kent_shakespear 7 · 2007-03-01 02:28:16 · 0⤊ 1⤋
Probably because both were imperialistic.
answer #7 · answered by sofista 6 · 2007-03-01 03:17:55 · 0⤊ 1⤋