What do you think is the influence of the US film industry on the world, socially, culturally and politically? How does it affect the way people think and act, both within and outside the USA? In particular, how does it affect the way people see and judge the USA, including, for example, how Muslims see it? What is it as a force in the world?
I think people, especially Americans, must really underestimate the importance of this issue in regard to attitudes, morals and politics; otherwise I'd have had more than 5 answers when I asked it twice yesterday.
Please state where you're from and your religion, ethnicity, etc., if you think it's relevant to your answer.
2006-11-17 13:02:58 · 4 answers · asked by Anonymous in Society & Culture ➔ Other - Society & Culture