I've always thought it absurd that the Hollywood crowd never takes any responsibility for the negative effects of some of their movies. The standard irresponsible answer is, "Movies simply reflect our culture; we can't be responsible for what people do after watching our movie" (or listening to our music, or reading our books, etc.). I think violent movies tend to increase violence in our society. But am I wrong?
2007-02-11 15:30:11 · 6 answers · asked by Tom Heston in Social Science ➔ Psychology