They owned Canada and the Thirteen Colonies along the eastern Atlantic coastline, as far south as Georgia, as late as 1776. Spain and France claimed other parts of North America.
2007-08-26 03:59:19 · answer #1 · answered by Michael J 5
No - Britain never owned America and neither did the United Kingdom. The Britons were pushed into Wales long before there were any issues in North America. America managed to dump the Crown before it was a United Kingdom.
The English Crown made claim to a small part of North America.
It is a common mistake to assume that Britain = United Kingdom = England.
2007-08-26 12:08:32 · answer #2 · answered by Mordecai Jones 3
Yes, at one point Britain owned the eastern coast, Florida, mid-America and northeastern America. However, Britain never owned Texas or California, but most of the people you see in California or Texas today are still descended from British people because of the expansion of eastern America. The British gave mid-America back to the French, as they were probably more occupied with colonizing lowland areas, and mid-America consists mostly of mountainous areas that are not suitable for immigrants. The French named mid-America and mid-Canada New France, but France never really colonized any of the New France land, so that's why there are not many Americans descended from French people today. The British never colonized Florida, and so in turn they gave it back to the Spanish, who they had seized it from. The Spanish never colonized Florida either; in fact, a couple of years after the British gave Florida back to Spain, Spain abandoned Florida.
The Spanish owned southwestern America, but again, Spain never colonized that land. Most of the prime land best suited for immigrants belonged to the British, which is why there are a lot of big cities on eastern soil today, like New York, Boston, Chicago, etc. In 1783 the Americans beat Britain with the help of the French and became independent; the British colonies were renamed American colonies, and this was early America before the expansion of eastern America.
Eastern America bought mid-America off the French and then colonized it by sending eastern Americans there, and then eastern America seized Florida and southwestern America from the Spanish and colonized them by sending more eastern Americans there. Eventually eastern America became the America we see today.
EDIT: Do not listen to anything Peter T says. Scotland and England joined kingdoms in 1602 and then Wales and Ireland in 1603. Britain founded the eastern coast in 1607, approximately nine years after the formation of Britain, so yes, Britain was Britain when the eastern coast was discovered and the land was British, not English. And BTW Peter T, the British weren't forced back to Wales, they were forced into Canada, so obviously you don't know much about U.S. history.
2007-08-26 11:17:04 · answer #3 · answered by Anonymous
Well, the UK colonized the upper east coast. Other countries did as well, such as the Dutch, the French, and the Spanish. The UK governed the colonies until the rebellion, when the colonies fought for and won their independence. From that point, the United States continued to fight for or purchase contiguous land until the States took the form we see on the map today. The United States also includes its own outlying possessions: distant states like Alaska and Hawaii, and territories like Puerto Rico and Guam.
2007-08-26 11:02:58 · answer #4 · answered by dougeebear 7
They owned the 13 colonies on the east coast of the current United States of America.
2007-08-26 10:49:37 · answer #5 · answered by PseudoSlySpyderGuyLied 3
Britain colonized America, but it did not purchase the land. Britain basically took the land from the Native Americans.
2007-08-26 11:11:38 · answer #6 · answered by staisil 7
They thought they did in the late 1700s, but we showed them the error of their ways.
2007-08-26 10:50:28 · answer #7 · answered by Iknowthisone 7
Some of it
2007-08-26 12:15:38 · answer #8 · answered by brainstorm 7