I would not say most, but some, and they are entitled to their opinion. I really don't care who likes or dislikes Americans. If they don't like us, that is their problem, not ours.
2006-06-27 16:44:44 · answer #1 · answered by John 5 · 11⤊ 2⤋
They don't all dislike the U.S. There have been certain situations, and certain people leading the U.S., that put the entire country in a negative light. Basically, it can be the same as racial profiling. Since what the rest of the world sees in its news about the U.S. is probably not good, many people assume that everyone here is war hungry and/or obese and/or ignorant and/or (insert your own problem here). The media is a strong thing, and while the U.S. is one of the most powerful countries in the world, it is easier to tear it down than to do anything else. The U.S. has money and power, and there will always be problems with what it does with them, both positive and negative. It is impossible to make everyone happy all the time, or even some of the time. Even people within the U.S. are not happy with the U.S. It's just like the stereotype that all French people are rude, arrogant, and have body odor issues. That is not at all true of all French people, yet many people think it is. Are all Americans the way they are depicted by the media? No, but then again, for some people it is true. I don't know if this actually answers the question, but at least it is something to think about.
2006-06-27 16:55:49 · answer #2 · answered by awattysiu 1 · 0⤊ 0⤋
I think it's a buildup of the impression people get from Americans; the media comes into play, as well as the politics.
Mix that all up together and you have a really ****** up view of Americans. I don't blame them. If I were foreign and saw all these things about Americans, I would think Americans are crazy people as well.
I also noticed when traveling to different countries that it was typically an American who didn't know how to show respect. A little respect and kindness can go a long way in a lot of countries, so I do what I can, but I get so embarrassed sometimes when my fellow Americans act so annoyingly superior to everybody... I don't get why we have to act like we own the whole damn world to inflate our egos.
2006-06-27 16:59:05 · answer #3 · answered by charming_imogen 2 · 0⤊ 0⤋
Most countries are controlled by corrupt leaders or dictators, and the media in these nations distort world events. Even in places where the U.S. is despised, many foreigners want to come here. Envy is certainly also a factor.
2006-06-27 16:48:00 · answer #4 · answered by nash1dan 2 · 0⤊ 0⤋
Pretty much everyone has a different theory. Personally, I believe that we (the US) stick our nose in everyone else's business. I think we should adopt a policy of "don't mess with us and we won't mess with you." If other countries wish to conduct business or diplomatic relations, then so be it. But we should not provide military or financial assistance to any other country. Let's spend that money at home.
2006-06-27 16:41:38 · answer #5 · answered by Jesus S 3 · 0⤊ 0⤋
Because we are the single world power.
When Britain was the world power (see Victorian England), it was the most hated.
2006-06-27 16:38:44 · answer #6 · answered by Sara 6 · 0⤊ 0⤋
Because the US government can't seem to mind its own damn business. And being a superpower doesn't mean you can go around bullying people with airstrikes.
2006-06-27 16:42:10 · answer #7 · answered by Anonymous · 0⤊ 0⤋
It may be the way the government treats other countries, such as invading Iraq.
2006-06-27 16:40:33 · answer #8 · answered by adhil12 3 · 0⤊ 0⤋
Because they are green with jealousy. Have you lived in France lately?
2006-06-27 16:41:54 · answer #9 · answered by Mr Answer 5 · 0⤊ 0⤋
Because the US gives them every reason not to.
2006-06-27 16:38:23 · answer #10 · answered by Sir Real 2 · 0⤊ 0⤋