no.....i don't think we are "taking over" — we are definitely making our presence known, though...and quite frankly i don't want women to take over....i don't wanna become a female acting like a male.
2007-06-11 08:33:15
·
answer #1
·
answered by Anonymous
·
1⤊
0⤋
Funny question, but then again, maybe not so... If you look at the demographics across the nation, you will see more female children being born and more women remaining single. Does that in itself equate to "taking over"? I think not. What it means is that there are more females in the nation right now than males. IF the trend continues, we might have a problem, but the fact that women are finally entering every walk of life only means we're progressing and opening up our lives, nothing more.
I believe in balance in all things...relationships, the workplace, etc. When you have this balance, it creates a more interesting life for us as humans.
I can tell you this... I'd be lost if I didn't have a father, brother, or significant other to bounce things off of and enjoy life with. Men very much need a woman's perspective, and we women benefit greatly from a man's.
To look at it any other way is rather sad to my way of thinking...
Grace
2007-06-11 15:36:35
·
answer #2
·
answered by bunnyONE 7
·
0⤊
0⤋
Women are taking over, without a doubt. The president couldn't be president without a woman by his side to take care of him. Most successful men have their successful wife next to them to take care of them. If you see two people walking down the street, both good looking, one woman and one man, who is everyone going to look at? The woman.
2007-06-11 15:33:51
·
answer #3
·
answered by Anonymous
·
0⤊
0⤋
I very much doubt it, although we're no longer in a male-dominated society; a lot has changed as a result of women's liberation. Perhaps someday Hillary Clinton will be commander in chief.
2007-06-11 15:37:48
·
answer #4
·
answered by slimdude142 5
·
0⤊
0⤋
Taking over what? I don't see that occurring in the world I live in; however, we have more and better positions than we've had in the past.
As far as I know, we're not trying to "take over"... this world needs both men and women to function properly... the sexes need to work together, not try to best one another.
2007-06-11 15:33:30
·
answer #5
·
answered by . 7
·
0⤊
0⤋
Taking over what???
This is a strange question, is what I think.
jeez...
2007-06-11 15:32:56
·
answer #6
·
answered by CC Babydoll 6
·
1⤊
0⤋
Taking over what? Generally speaking, women still do not hold as many CEO seats as men, nor do they get paid as well as men. That's not right, but that's what the statistics say. So I'm not sure what you mean by taking over.
2007-06-11 15:34:10
·
answer #7
·
answered by TheAsianPlagueFR 3
·
0⤊
0⤋
Nope, in truth men still have all the top jobs, and you women follow. Forget all that Sex and the City bs.
2007-06-11 15:32:21
·
answer #8
·
answered by Anonymous
·
1⤊
0⤋
Unfortunately, no. I wish they would. The world would be a better place.
2007-06-11 15:34:07
·
answer #9
·
answered by Anonymous
·
0⤊
0⤋
Just in the nick of time too.
2007-06-11 15:34:04
·
answer #10
·
answered by Ray2play 5
·
0⤊
0⤋