I am all about strong, buff women who can hold their own and kick a** if necessary, but I hate the new trend in movies.
Lately, there have been a lot of movies where the tough chick gets into a physical fight with a man. Even if she "wins" the fight, he still gets a few good punches in.
What happened to teaching boys that it's never okay to hit a girl?
Movies like these make it look like girls can hold their own in a fight with a guy, and that women should be able to defend themselves the way most guys can.
Even if the girl is superhero strong (which is almost never the case in real life) and can actually win a fight with a man, I don't think movies should depict men punching and kicking women.
Any thoughts?
2007-03-28 17:21:18 · 13 answers · asked by loves2fly84095 in Psychology