
Today the British are all over America: reality shows, producers, music, setting examples for good living and a way of life. It is a fact that they gave America its independence, and America changed the English language first, among many other things, while the rest of the world stayed the same. Now Americans think everything they say and do is right, without knowing their history. If 360 countries in the world say and do the same as the British, why does America think what it says and does is right and better? And yet foreigners, with their British mentality, still seem to succeed more than most Americans.

2007-02-23 07:53:39 · 7 answers · asked by MR DESTINY 2 in Society & Culture Other - Society & Culture

7 answers

What makes you think they want it?

2007-03-02 17:24:55 · answer #1 · answered by Mr. Been there 4 · 0 0

Well, I think your question is a little too general. ARE the Brits taking back America? Fat chance. America, despite what is currently taking place OUTSIDE our borders, has a strong foundation of independent-thinking entrepreneurs. I don't believe we all think we are better than every other country, either. We have made huge advances as a nation in engineering, technology, marketing, agriculture, etc. Also, you are wrong about the British "giving" us our independence. You might want to check YOUR history there. We had this little thing called the American Revolution; we actually took our independence from the Brits. Now, as far as success goes, check the facts again. America has more millionaires per capita than any other nation, with the exception of one or two oil-rich kingdoms. I'm not sure where you came up with your ideas, but you are a little confused there.

2007-02-23 08:18:33 · answer #2 · answered by Alchemist 4 · 0 1

I'm from the UK, and I think if anything we are becoming more and more Americanized. You're saying that we export lots of entertainment to you, but on balance you probably export more to us. Pretty much every film you can see here is American, and so are a lot of the TV, adverts, brands, music...

With the end bit I can't really tell what you're saying...

2007-02-23 08:08:20 · answer #3 · answered by Wanttoknow 2 · 0 0

Americans don't all think they are right or that everything they do is right. American English is more correct than theirs. I never understand British English at all; they don't even say their H's.

2007-02-23 08:12:32 · answer #4 · answered by MuRdEr 4 · 0 0

What are you trying to say?
Are you saying that America is starting to have more British values or that they think they are better?
I'm confused.

2007-02-23 07:59:33 · answer #5 · answered by Sarah* 7 · 0 0

What do you mean "taking back America?"

It wasn't theirs in the first place, and it wasn't the Americans' either. You are all foreigners.

2007-02-23 08:03:42 · answer #6 · answered by ViolationsRus 4 · 0 0

No, I don't think that at all. Go to the UK and see how much American television and culture have taken over there. It's insane.

2007-02-23 07:58:12 · answer #7 · answered by herbritannicmajesty68 3 · 0 0
