I read in magazines, TV shows, etc., how to please a woman, how to keep her happy, and so on. What about what we want? Society always centers everything around a woman and what makes her happy. We have needs too; we want to tell women what we want, but it never gets publicized, put into the media, heard on the street, and so forth. Females are spoiled; us males never get spoiled by the opposite sex. What do guys want? We want to be spoiled and babied too. I'm a tough guy, but I like the attention too. Do females care that we have needs too, or is it a one-way street and all about you all? All give from us and you all just take? Females, let me know if I'm wrong for seeing it like this, but everywhere we go it's all about the females and how to please them. I'm not talking about just sex either; I'm talking about everything.
2006-06-08 15:26:22 · 9 answers · asked by Shawn J in Family & Relationships ➔ Singles & Dating