I am confused by what women claim to be 'Liberation'. Am I the only guy who thinks women say 'Yes' way more than they should? I mean, I don't think people should force others to do anything, but it seems like women (in general) have become more promiscuous and careless about their bodies in the last 30 years.
My dad told me when I was a kid that he (and other guys) were always trying to have their way with girls, but that the girls always said, "No." He said men have always been pigs, but women were the great equalizers. He said that back in the early 1960s a lot of guys tried to get their girlfriends to move in with them, but that the girls usually said, "No."
My question is this: Are we better off as a society today, now that women are more likely to say "Yes" to everything? Are women more respected now that they have the same moral compass as men? Is this what women really wanted when they started burning their bras?
2006-10-02 16:35:25 · 9 answers · asked by envision_man