
2007-01-05 22:25:13 · 13 answers · asked by Lanre S 1 in Politics & Government

13 answers

Only one thus far.

Iraq.

And that is going about as well as it did for every country that England tried to colonize.

2007-01-05 22:28:58 · answer #1 · answered by Anonymous · 1 2

The USA colonized most of its own new states, in the South and West. And it had a civil war back in its history; some say it was a war to free the slaves, but that was not the real issue: it was independence from Northern colonialism.

2007-01-06 06:51:53 · answer #2 · answered by Anonymous · 1 0

Most of the places mentioned in this list by everyone were COLONIES of Spain, handed over after it lost a war. Most of the current USA was taken by the Brits, the French, the Spanish, and Russia prior to 1776, or prior to any expansion of the new country. Most of the Indian wars occurred within what was considered US territory, acquired through purchase from those countries or through war with them. A piece of Germany was ours and we gave it back, as we did Japan and the Philippines.

2007-01-06 06:58:56 · answer #3 · answered by Anonymous · 1 1

The only remaining colony under US control is Puerto Rico (from July 25, 1898 until TODAY). But Puerto Rico remains a colony because its people are too divided to decide whether they want independence or statehood. Their commonwealth status is a joke that only pro-commonwealth party members can believe.

Puerto Ricans were made US citizens in 1917, conveniently in time to fight in WWI. But they have been heroic in their participation in every war since.

2007-01-06 07:09:37 · answer #4 · answered by David G 6 · 0 1

U.S. past and present territorial possessions

Guano islands annexations
Annexation of Hawaii
Annexation of Spanish Colonies following Spanish-American War
Guam
Philippines
Annexation of American Samoa
Annexation of U.S. Virgin Islands
Trust Territory of the Pacific Islands

Overseas interventions

Interventions in Latin America
Interventions in Asia
Interventions in Europe
Interventions in the Middle East

2007-01-06 06:32:20 · answer #5 · answered by Brother Mike 4 · 6 0

If colonised means the same thing as fertilised in American English, then the Americans have colonised most of the planet. The first real colonisation took place during WWI, right here in the 51st State, the UK.

2007-01-06 10:14:14 · answer #6 · answered by Anonymous · 0 0

Why not start with the nations in the country itself: Cherokee, Apache, Choctaw, Comanche, ad infinitum.
Then go on to the Pacific and Atlantic islands, Mexico, South America, etc.

2007-01-06 06:50:07 · answer #7 · answered by Anonymous · 1 0

I don't think the USA has colonised another country, although they occasionally invade one (sometimes for a good reason, in fairness).

The nearest thing would be Puerto Rico, which sort of declared itself to be part of the USA (although it does not have the status of a state).

2007-01-06 06:32:11 · answer #8 · answered by Well, said Alberto 6 · 1 0

The US has no colonies. We came along a little too late; by then, most of the world had already been claimed. We do have territories and protectorates, but all of those places still have their native populations.

2007-01-06 06:31:11 · answer #9 · answered by Paige D 2 · 0 1

I am glad to see that one person, THORGIRL, has a good understanding of actual American history. Not the BS that PC advocates are trying to spout with their revisionist "America is a warmongering historical monster that has crushed and annihilated everyone it comes into contact with." Do they actually teach history in school anymore?

2007-01-06 07:13:13 · answer #10 · answered by mark g 6 · 1 2
