
6 answers

Not all of Hollywood is liberal. Just the kooks who get most of the attention.

2006-12-19 10:11:00 · answer #1 · answered by JFra472449 6 · 1 0

Hollywood isn't liberal or conservative.
Actors, directors, etc. from all political parties are hired.
Granted, most people on the far right get over-the-top agitated about things like swearing, nudity, anything related to magic, any reference to religion, and other things that occur in quite a few movies,
but Hollywood is more about money and trends and finding a good audience.
They put out a CIA movie for patriots, law enforcement and their advocates, and historians (and naturally, fans of the hot star will watch it just for him).
Then they have a big horror movie, cute kid movie, romance, comedy, drama, and documentaries so everyone goes and spends money.

2006-12-19 11:57:50 · answer #2 · answered by Anonymous · 0 0

The only thing more important than being "liberal" in Hollywood is making money, and they would make a movie about dogs and cats sleeping together if it made a buck.

2006-12-19 10:23:14 · answer #3 · answered by Carlos D 4 · 1 0

There is no such thing as liberal Hollywood. Hollywood cares about one thing: money. Not politics.

2006-12-19 10:09:53 · answer #4 · answered by Take it from Toby 7 · 3 2

Joseph McCarthy lives!

2006-12-19 10:14:51 · answer #5 · answered by Garth Rocket 4 · 0 1

It's not possible.
It must be a trick.
"Beware commies bearing honest films".

2006-12-19 10:13:01 · answer #6 · answered by Anonymous · 0 1
