
When did the USA and Britain become allies? I know during the Civil War they wanted to help the South, and even as recently as 1901 some British politicians wanted to invade the "colonies". So when did the Brits become our best ally? And is there still a sentiment that we are their colonies, and are there some Brits who still want to invade us?

2007-01-28 16:42:49 · 4 answers · asked by Anonymous in Politics & Government Military

4 answers

We have been allies since before WWI. There will probably always be resentment under the Brits' skin, but if they had treated the colonies better we would probably still be colonies, or at the very least a lot better friends....

2007-01-28 16:50:37 · answer #1 · answered by Taba 7 · 0 1

The US and Britain have always had this love/hate, dysfunctional-family kind of relationship.

Up until the 1900s there were tensions, but the relationship was generally cordial, save for the Revolution and the War of 1812, because of the amount of trade the two countries shared.

In WWI the bond grew stronger, but the isolationist movement of the postwar era began to sever the friendship. Many Americans saw Great Britain as an anachronism, an imperialist nation in a time of democracy, and pushed very hard to stay out of Britain's war with Germany.

Thanks to Churchill, FDR, and some members of the US press, Americans started to see the Britons not as imperial oppressors but as tough little underdogs battling a great evil.

WWII finally cemented the relationship between the two powers.

Well, I don't think there are any Brits who want to invade us, though a few people do refer to us as "Colonials." A number of people still see Americans as rebellious children. A good friend of mine who is Scottish has that attitude.

But at the end of the day we are pretty fast friends.

My friend and I discussed Britain becoming a part of the States rather than joining the EU. He felt it would be better if the US returned to being a colony.

Ahhh well... maybe someday we will be one again. Hmm, the state of Scotland... I like the sound of that.

2007-01-28 19:18:00 · answer #2 · answered by Stone K 6 · 2 0

The US and Britain weren't formally allied until WW1. Britain did support US efforts in the Spanish-American War, but the US did not reciprocate during the Boer War. The US had no formal alliances with ANY nation prior to that, carefully following Geo. Washington's advice to "avoid foreign entanglements".
This was somewhat controversial, more for Americans than for Brits, due to the large Irish and German ethnic populations in the USA. What most folks don't realize is that after WW1, US/UK relations became notably frosty. Right up to 1940, many upper-class Brits disdained the US (and potential cooperation with it; note these were also the idiots who as late as 1941 wanted a deal with Hitler). The "special relationship" came about during WW2, and it pretty much remains to this day in intelligence-sharing and military cooperation.
Probably the latest example, other than counterterrorism, is US support for the Royal Navy's plans to build two carrier battlegroups for power projection.

2007-01-28 17:31:15 · answer #3 · answered by jim 7 · 2 0

The Brits and the US had an uneasy peace going up until the Great War (WW1). When the Great War happened, the US got involved to help out Britain and her allies. After the Great War, the US and Britain were strong allies.

2007-01-28 17:49:13 · answer #4 · answered by Nasty Leg 2 · 0 0
