
3 answers

Yes. The seizure of Hawai'i is what started the process of imperialism, and after the Spanish-American War the United States fully became an imperialist power.

2007-10-21 08:10:33 · answer #1 · answered by Anonymous · 1 1

Not really. The United States does not have colonies, but we did acquire many territories in the late 19th century, including Hawaii, Midway Island, and Samoa. After the Spanish-American War, the United States took possession of Puerto Rico and Gitmo.

2007-10-21 11:52:01 · answer #2 · answered by wichitaor1 7 · 0 0

I like the U.S. I think she may have gotten somewhat off course... it's better to continue to use international relations rather than brute force, even when you can back it up. But I think America's problems are more a matter of people taking the easy way out, corporate greed, and some small minds' selfishness, rather than an all-out government-changing conspiracy. I think the U.S. usually tries to do the right thing, even if it's somewhat immature about it. We're still a young country, after all.

2016-10-13 10:35:01 · answer #3 · answered by ? 4 · 0 0
