
13 answers

They don't seem to. They really know nothing about the rest of the world. The reasons are:
1 - They believe that the whole world revolves around them.
2 - They only care to learn when it means money. Otherwise, they say it is a waste of time: they don't need to know anything except how to make money.

2006-11-14 07:14:28 · answer #1 · answered by Dios es amor 6 · 0 3

Like most people around the world, most Americans don't travel to other countries, know few people who do, and only get a distorted view of the world from the mass media (which includes books).

I suggest people travel outside their country: it makes them more patriotic toward their own country and helps them develop a better understanding of it and of the world.

2006-11-15 08:36:43 · answer #2 · answered by E A C 6 · 0 0

Because we, as Americans, are pompous and believe the rest of the world loves us. Therefore, we make no effort to learn about the rest of the world.

And we wonder why the rest of the world hates us so much...

2006-11-14 15:30:28 · answer #3 · answered by umwut? 6 · 0 1

Maybe because anytime something happens in the world, everyone comes running to America, thus making us feel that we are the only ones who can bail everyone else out.

Oh, man, I keep editing this answer. How many countries do we Americans have our fingers in? What a stupid question.

2006-11-14 15:15:06 · answer #4 · answered by mei-lin 5 · 0 1

My own opinion is that many Americans don't care unless it's happening in their neighborhood, and they are too wrapped up in material things to take the time to read.

A big problem arises when Americans complain about things elsewhere while having no clue about the history and current events of other countries. We are supposedly the richest country, yet we are rather lazy compared to others. Many Americans embarrass me. Knowing what is happening outside our borders is important to sustaining what we have within our borders. Besides, we all live on one planet, and we need to keep it going for future generations of children. Knowledge is power.

2006-11-14 15:15:25 · answer #5 · answered by Anonymous · 0 2

Movies, television, and the media all seem to focus on just the US. Americans are fed propaganda throughout their whole lives about how their country is the greatest in the world and how everyone who is not American is somehow beneath them. Also, the fact that the US is the fourth-largest country in the world by area means that they can drive for days and never leave their own country.

I would say that most Americans just don't care what's outside their borders.

2006-11-14 15:14:06 · answer #6 · answered by dunc1ca 3 · 2 3

That is the general stereotype of Americans, and we all know stereotypes aren't always true.

I, for one, know enough about other countries to be respectful and insightful if I were to visit. Respect is common to all countries; Americans just seem to throw it out the window. People shouldn't be able to tell you're an American when you visit their country, because then you're fulfilling that stereotype.

2006-11-14 15:12:32 · answer #7 · answered by stu b 2 · 2 0

1. It isn't taught in our schools and should be.
2. We are too busy trying to make a living.
3. We are too busy trying to make heads or tails of our own.
4. Most cannot afford to travel.

2006-11-14 15:46:54 · answer #8 · answered by daisy 4 · 0 0

For one thing, we are isolated here with huge oceans on either side of us. Most of us cannot afford to travel abroad. So, what I do know I get from books, TV, and others who have traveled.

2006-11-14 15:12:45 · answer #9 · answered by Gorgeoustxwoman2013 7 · 2 0

Because the world comes to them, they think they know the world.
I guess.

2006-11-14 15:12:28 · answer #10 · answered by cass 7 · 3 0
