
I just want to know what you think of Hollywood. Is it all about entertainment? Or about glamour? Or do you think that everything in Hollywood is just an illusion?

2007-12-03 15:43:19 · 4 answers · asked by .Haleigha. 2 in Society & Culture Other - Society & Culture

4 answers

Sex, lies, slander, greed, and a political agenda disguised as art. Just my opinion. TV and film have become such a huge part of American culture that I think their production should definitely be more representative of America in general. Right now, Hollywood owns everything, and I think that sucks.

2007-12-03 16:02:20 · answer #1 · answered by Anonymous · 0 0

It's big business. It's about making money.

2007-12-03 23:48:32 · answer #2 · answered by Truth 7 · 0 0

Swimming pools, movie stars, drugs, sex, lies...

2007-12-04 03:04:35 · answer #3 · answered by juan_tamad 2 · 0 0

Probably the whore of new Babylon.

2007-12-03 23:49:38 · answer #4 · answered by Happily Happy 7 · 3 0
