
I haven't been to the United States before, so do Hollywood movies really show the true face of life in American society?

2007-07-29 06:20:35 · 11 answers · asked by Anonymous in Society & Culture Other - Society & Culture

David, I'm not superficial like you.

2007-07-29 06:24:05 · update #1

Lisa, this is a serious question.

2007-07-29 06:26:05 · update #2

11 answers

Good question. No, Hollywood movies do NOT accurately portray life in the USA any more than Jackie Chan's older movies accurately portray life in Hong Kong, or the "Carry On" films portray life in Britain. American movies are merely exaggerated portrayals of whatever the producers/writers/directors decide to put on screen. Many people who work in Hollywood seem to look down on and sneer at the average American, deriding those of us who do not live and work in New York or Los Angeles as a bunch of ignorant hicks.
We Americans are not all, or even mostly, violent, xenophobic, insane losers. I invite you to come to the USA, travel the back roads, and meet the people. The USA has a culture that is very different from those of Asia, but I think you will be pleasantly surprised at who and what you find.

2007-07-29 06:52:56 · answer #1 · answered by sandislandtim 6 · 2 0

Hollywood is so full of falsehood that they wouldn't know how to tell the truth if it hit them in the face. You will find very good and decent people in the USA who keep their clothes on and can speak without cursing. At one time, before Gone With the Wind, they could make movies without cursing and without nude scenes, but not any more.

2007-07-29 06:27:09 · answer #2 · answered by Jeancommunicates 7 · 1 0

Some fiction is based on some truth, but Hollywood takes a lot of liberty in its films. No, phone numbers in the States do NOT all start with '555-'. I saw a movie about U.S. Navy SEALs filmed in Norfolk/Virginia Beach. The passenger told the driver he would get off at a certain hotel; the scene was downtown Norfolk, yet when the passenger got out, he was in Virginia Beach! There are many examples of things like that. Also, real cops don't always act the way they do in cop movies. An old friend of mine, a retired Norfolk police captain, refuses to watch any cop TV shows or movies because he always finds fault with what goes on in them. Military shows like JAG (Judge Advocate General: military lawyers) were so phony that my wife would get mad at me when I watched them, because of all the incongruities.
So, bottom line, take U.S. movies with a grain of salt and remember...it's ENTERTAINMENT.

2007-07-29 06:38:09 · answer #3 · answered by AmericanPatriot 6 · 0 0

Not at all. Hollywood does not portray American culture as it is; those people are so out of touch with reality that sometimes it is scary.

2007-07-29 10:41:55 · answer #4 · answered by Anonymous · 1 0

Not exactly. They exaggerate to make it seem much better than it actually is, lol.

For one thing, not everyone looks like an actor. You'll see many more fat people with bad fashion sense. Also, in most places you are really unlikely to be jumped by a random gang.

2007-07-29 06:23:49 · answer #5 · answered by notmakani 3 · 1 0

NO. You won't see a bald man trying to dodge bullets in every town he goes to (Bruce Willis), but you definitely WILL see an idiotic chimpanzee running the country.

2007-07-29 06:26:31 · answer #6 · answered by ninjaadam 2 · 0 0

Many times - no. Hollywood movies are mostly left-leaning.

2007-07-29 06:23:38 · answer #7 · answered by j b 5 · 1 1

Yes, but a little bit more exaggerated.

2007-07-29 06:29:09 · answer #8 · answered by @NGEL B@BY 7 · 0 0

You're kidding, right??

2007-07-29 06:25:06 · answer #9 · answered by Lisa A 7 · 1 0

No, those movies are fiction.

2007-07-29 06:22:32 · answer #10 · answered by ? 4 · 4 0
