I recently saw the movie "The Hills Have Eyes," which depicts violent rape and sexual assault rather vividly. It made me very uncomfortable as a woman, and for a little while afterward I didn't even want my boyfriend to touch me because I felt devalued by what I saw onscreen. It also bothered me that so many men can watch violence toward women onscreen and see it as nothing but entertainment. The movie was horrifying, but I was too angry at it to be scared by it. My anger raised a number of questions. Is sexual violence toward women a valid subject matter for any movie, and is there a right and wrong way to portray it? Does violence toward women in the media contribute to the devaluing of women in our society? Is it right for the entertainment industry to capitalize on a violent crime that is both very real and very devastating to those who experience it? Is there anything that can be done about it, and if so, what? I'd like to know how others feel about this.
2006-06-28 14:28:41 · 10 answers · asked by Anonymous in Other - Entertainment