This is a huge question: are women the dominant sex these days? In my view they are, for several reasons, mostly to do with environmental changes:
-Women are more career-minded
-Women know that men are weak-minded when it comes to sex
-Women can easily exploit men (if they choose to)
My main reasoning comes from nights out I have been on, where I have noticed women jumping on men and playing games non-stop. Although it could be argued that men could refuse to play these games. Viewpoints please.
2006-11-26 23:00:15 · 15 answers · asked by dfds i in Lesbian, Gay, Bisexual, and Transgender