I'm borrowing here from Leonard Pitts of the Miami Herald.
"If Hollywood doesn't represent American values, how did the money pile get so big?
Spider-Man 3 made $373 million last weekend. You do not sell $373 million worth of anything in just three days unless you have a pretty good idea who your customers are and what they want.
You kid yourself if you truly regard (Hollywood's) values as some alien belief system foisted upon righteous, defenseless Americans.
Hollywood is driven, like any business, by market forces. In America we vote with our money. And for all the talk about Hollywood as a bastion of liberalism, the truth is this business -- again, like any other -- is conservative.
People are generally conservative when it comes to their money."
Reactions?
2007-05-14 13:39:27 · 10 answers · asked by Anonymous in Social Science ➔ Sociology
Look to who owns the studios, not the liberal actors. Those people are in the business of making money, not setting cultural goals disconnected from reality. Though that sort of thing does seep through from time to time.
2007-05-14 13:47:06 · answer #1 · answered by Anonymous · 0⤊ 1⤋
Hollywood represents the perspectives of those pulling the strings. There are plenty of people in this country who go to movies, support TV shows, and buy clothing simply because some public faces tell them to. I know that Hollywood doesn't represent my beliefs. Gibson tried and was accused of being anti-Semitic.
2016-11-23 12:47:29 · answer #2 · answered by quartermon 4 · 0⤊ 0⤋
It's definitely true. The filmmaking industry works around particular social needs, and it does so according to the historical moment. World War II popularized musicals because people needed simple entertainment. After that period, once the warplanes stayed on the ground, UFO movies became popular, and lots of monsters started attacking us.
If you consider things carefully, it is easy to see the relation between American foreign affairs and the production of movies on particular contemporary issues: the bad guys were German, then Russian, eventually Japanese, and now Arab.
2007-05-14 14:22:46 · answer #3 · answered by Fromafar 6 · 0⤊ 2⤋
Hollywood shocks us first. People go to see movies because they are bored and want something new to do; some want to learn how to react in certain situations and feel they can't learn that anywhere else (though what they are learning is fantasy, not reality). Then people mirror Hollywood. When they apply that fantasy in their own personal lives, their relationships fall apart and they don't know why.
2007-05-14 15:08:28 · answer #4 · answered by sophieb 7 · 0⤊ 1⤋
No.
Hollywood, through its actors, writers, directors, producers, and studio executives, tries to bring us its perspectives on our culture. Some in Hollywood simply see their craft as an expression of their art and give us their view of the world, which many of us, in turn, try to define as our culture. That's not their fault; it's ours.
Like all of society, Hollywood has its good, bad, and ugly. But in the end, it is just another facet of our society with its own perspective on the world.
2007-05-14 14:53:23 · answer #5 · answered by txguy8800 6 · 1⤊ 1⤋
No. Hollywood tries to define our culture.
2007-05-14 13:43:02 · answer #6 · answered by TAT 7 · 2⤊ 3⤋
Not sure, but it definitely impacts and shapes our culture.
2007-05-14 14:32:55 · answer #7 · answered by Anonymous · 1⤊ 2⤋
No, but it shows what kind of entertainment we enjoy, which is violence and more violence. Then again, it mirrors our minds.
2007-05-14 14:09:47 · answer #8 · answered by venom 1 · 1⤊ 2⤋
What culture? We haven't had anything resembling one for years, thanks in no small part to Hollywood. Make of that what you will.
2007-05-14 13:52:15 · answer #9 · answered by Anonymous · 0⤊ 3⤋
I pray to anyone ... LOL.
He's wrong.
2007-05-14 19:22:38 · answer #10 · answered by Trent 4 · 0⤊ 2⤋