
give me examples

2006-10-19 15:44:10 · 8 answers · asked by sexy lady 1 in Arts & Humanities History

8 answers

"Colonies" was the word used at that time; it basically meant that the colonies were not independent and that their backing came from another country, to which they owed allegiance. Otherwise the word "settlement" would have been used. They were the beginning of this nation under British rule, although the Spanish colonized here first, in Florida and Georgia. A colony was a base of operations to see whether it could sustain itself and produce goods for export back to the mother country. It was a charter, or agreement of sorts, for profit.

2006-10-19 16:41:53 · answer #1 · answered by AJ 4 · 0 0

I don't believe colonies were important to the foundation of the United States. Colonies were made up of people who were dependent upon the laws of England. Only when men living in "colonies" began to question the "laws of England" did "revolution" occur. Only when men such as Thomas Paine, Benjamin Franklin, Thomas Jefferson, John Adams, James Madison, and Monroe chose to "jump on the bandwagon" did change begin. These men saw an opportunity and pursued it.

Spain and France still owned much of what we now know as the U.S., and they were not going to be easily usurped by the monarchy of England. The colonies were a "veritable smorgasbord" of nations, attempting to carve out their destiny in a "new world". War formed the foundation of the United States, just as it does today.

2006-10-19 18:39:52 · answer #2 · answered by Baby Poots 6 · 0 0

The colonies were funded by the European powers, which were in turn paid back through trade and the exploitation of natural resources.
The colonies were important in that each had its own identity and persona, while at the same time being "in the same boat" as the rest.
Plus, most elementary of all: if the colonies had not been founded, the average man or woman would never have ventured away from the homeland. Even knowing that land had been found, they would not have left for somewhere without a support network.

2006-10-19 16:27:56 · answer #3 · answered by wi_saint 6 · 0 0

The reason the colonies are the foundation of the United States is that, in order to become a country, you first have to be a colony. The colonies gave the United States the personalities of different countries. For example, we got our opposition to slavery partly from France: France gave us the Statue of Liberty because it was against slavery, though we say it is about immigration.

2006-10-19 15:57:54 · answer #4 · answered by icac83 3 · 0 0

Because without the colonists' philosophy, this country would be just like any country in Latin America. The colonists here came and exploited the resources for themselves, not for England. The colonists in Latin America, by contrast, came to kill, or should I say massacre, the Indians, exploit resources such as gold, and then send it all back to Spain. Those bastards were a bunch of no-good remnants of the war against the Moors.

2006-10-19 15:54:29 · answer #5 · answered by Morgan 3 · 0 0

Well, the colonies were an incentive for people to come to America. Without the colonies being established, people would have been much more hesitant to come to America from European countries; the risk would almost certainly have been death until more colonies and places were set up to support the people coming from Europe.

2006-10-19 15:58:49 · answer #6 · answered by christy 6 · 0 0

Before they were states they were colonies of Great Britain.

2006-10-19 15:47:34 · answer #7 · answered by Sinned2471 3 · 0 0

Together we stand, separate we fall.

2006-10-19 21:16:26 · answer #8 · answered by TOM P 3 · 0 0
