Well, do you? It seems to me that "rape" has become a blanket term for anything from a true sexual assault to someone's wounded feelings over a bad date.
As a society, we've 'glorified' the act through the media; broadcasting the popularity of the accusation, the exploitation, the attention, the advocacy. I neither condone nor approve of the rape of men *OR* women, but I do feel the term has come to mean something beyond its original intention.
The dictionary definition is as follows: "any act of sexual intercourse that is forced upon a person."
Does anyone else feel as though the term has become as common as saying "I love you," both phrases losing their true meanings to become standby phrases and excuses for anything that could happen?
2006-09-20 15:50:53 · 9 answers · asked by Miss Kitae in Other - Society & Culture