
2007-08-10 11:10:48 · 30 answers · asked by Anonymous in Society & Culture Languages

30 answers

The North American continent was all Indians. Then the Spaniards came to Mexico and Central America, and the Europeans came to Plymouth Rock on the East Coast.
Everything used to belong to someone else.
Iraq was Babylon, Iran was Persia, and much of the rest was Israel. So nothing stays the same.
Now it is a hodgepodge of everyone.

2007-08-10 11:18:43 · answer #1 · answered by cloud 7 · 1 4

Yes. If you look at a really old map, you'll see that the west of the country was owned by the Spanish, the central part was France's, and the east was England's.

2007-08-10 18:23:26 · answer #2 · answered by R@u7 1 · 1 0

When do you mean by "used to be"? If you mean before the U.S. bought it (along with other territory a long time ago), then yes.

Now, Spanish is spoken in Florida so often because of the immigrants/new citizens. If you are still confused and haven't noticed, Florida is generally closer to Cuba, the Dominican Republic, and the whole South American continent than other U.S. states are.

And Ponce de León named it "La Florida," which I think means something like the land of flowers.

2007-08-10 18:38:19 · answer #3 · answered by phoebe r 3 · 0 1

Florida was a Spanish territory under the control of the virreyes (viceroys) of Mexico. The viceroys were governors of a large territory. They were appointed by the king or queen of Spain. Mexico, California, Florida, New Mexico, Texas, Puerto Rico... all belonged to the virreinato (viceroyalty) of Nueva España (New Spain).

2007-08-10 18:40:51 · answer #4 · answered by Anonymous · 1 1

Florida was colonized by Spain. The name is Spanish for "Land of Flowers". Georgia was put where it was in part to buffer the wimpier British colonies from Florida.

So, yes, Florida was owned and used by the Spanish for a couple hundred years.

2007-08-10 18:20:40 · answer #5 · answered by Matthew Stewart 5 · 2 2

Cristobal Colon was an Italian who was hired by Ferdinand...ALL of the Americas (including Florida) used to be Spanish territory.

It wasn't until England's victory over the Spanish Armada that England started to be able to exercise some real control in this hemisphere.

2007-08-10 18:14:58 · answer #6 · answered by Dominus 5 · 2 4

Yes. When Spain, France, and the Netherlands owned most of what we now call the United States, Spain got what we now call Florida. It was considered undesirable, though, because of all its swamps.

2007-08-10 18:18:50 · answer #7 · answered by sawlmw2003 4 · 3 1

Yes, the Spanish discovered it and claimed it as their territory. They tried to kill off all the Indians that lived there and almost succeeded......

2007-08-10 18:21:07 · answer #8 · answered by ? 5 · 4 0

Florida used to be a Spanish colony....

2007-08-10 18:17:57 · answer #9 · answered by AZTECAMAN 4 · 1 2

Yes!
OMG! I see a lot of them since I moved to Florida; maybe there are more of them than American people!

2007-08-10 18:47:19 · answer #10 · answered by Mirabelle 6 · 0 1
