They controlled 13 colonies on the East Coast until the Revolution, but they never controlled all of what is now the United States. France controlled most of the middle of the continent (which it sold to the US), Spain controlled Florida, Texas, and California, and the Russians controlled parts of the West Coast and Alaska (which they sold to the US).
2006-07-29 08:24:02 · answer #1 · answered by esteban 3 · 1⤊ 0⤋
LOL! Yes, England was in control of the USA... this country got its start as English colonies. The Declaration of Independence was directed at England's dominance over the colonies, and the Revolutionary War was fought to win that independence. Every July 4th, we celebrate that independence.
Now, shake hands, and go sober up...
EDIT: after scrolling through other answers
HA! Looks like there will be a few more drunken arguments. Way to put some excitement into a boring Saturday afternoon!
2006-07-29 14:00:12 · answer #2 · answered by Anonymous · 0⤊ 0⤋
Not since our independence in the Revolutionary War. England invaded the United States in the War of 1812 but did not control the United States.
2006-07-29 14:04:21 · answer #3 · answered by wy82331 1 · 0⤊ 0⤋
Once the colonies declared themselves the United States of America in 1776, they were still under British control until the end of the Revolutionary War. Again, during the War of 1812 the US fought against Britain, and had the US lost, it would have been under British rule again; but the US prevailed, and the war ended in early 1815.
2006-07-29 14:01:28 · answer #4 · answered by Anonymous · 0⤊ 0⤋
Well, no. They had control of the land that later became the United States of America, but at the time they controlled it, it was not the USA; it was the British colonies in the Americas.
2006-07-29 13:59:34 · answer #5 · answered by Anonymous · 0⤊ 0⤋
England has never been in control of ALL of the USA at once. They controlled parts (mostly major cities) during the Revolutionary War and the War of 1812 (when they burned Washington, DC).
2006-07-29 14:00:04 · answer #6 · answered by rb42redsuns 6 · 0⤊ 0⤋
Well, we sort of were, back in the 1700s, but then they wanted independence from Great Britain.
And the Brits backed the wrong side and lost.
Hence the saying "the South's gonna rise again."
We backed the South that wanted to keep slavery and never, ever got our money back!
Make sure you drink plenty of water to ease your poor alcohol-addled brain, huh?
Might make your hangover a little easier to cope with too!
2006-07-29 14:01:38 · answer #7 · answered by Anonymous · 0⤊ 0⤋
No, not ever.
"The American States", known as the Americas, used to be a British colony. But the United States of America was formed after we Brits were kicked out.
So no, we have never been in charge of the USA.
But for a laugh, go to the page below
2006-07-29 14:00:06 · answer #8 · answered by Anonymous · 0⤊ 0⤋
Well, once, during the Revolution. But that was back in the 1700s. And they weren't really in control of the USA; they just controlled the land. And it wasn't just England; it was the British.
2006-07-29 13:58:14 · answer #9 · answered by silverboy470 4 · 0⤊ 0⤋
Well, Independence Day celebrates the separation from England. However, I believe the US was not a country until afterwards, so I'm rolling with no.
2006-07-29 13:58:18 · answer #10 · answered by John R 4 · 0⤊ 0⤋