It is not true. Women in the West are as exploited as women in the rest of the world. Think Playboy and Hollywood: they glorify women as if they're goddesses, but the devil knows what those women went through before achieving their status, lol. I think the only difference is that women in the West know how to play along. They know that the cosmetics products, owned by men, enslave them and strip them of their cash just so they become attractive, because that's how the media (which, incidentally, are owned by men) portray desirable women. And when they're desirable, men benefit. And when the men are done, they divorce them. Also, divorce is the invention of men. Haha, so there you go.
2006-07-10 19:06:30
·
answer #1
·
answered by Anonymous
·
0⤊
0⤋
Males of most species on our planet are known to be dominant over females. Although it is true that women are supposed to be equal in abilities, intellect, and strength (respectively), most men still believe that women are weak because of what they have been taught. Today, it is not surprising that men don't take women's abilities seriously or consider them suitable for some positions in local jobs or government. It will take time to undo the damage caused by corrupt men in dark parts of our history.
2006-07-10 18:48:11
·
answer #2
·
answered by The Young Creator 2
·
0⤊
0⤋
Ever heard that quiet people who practice the art of observation more often than flapping their jaws are usually the smarter ones? And that behind every great man is a greater woman? It's like Ranjer says, women "allow" men the illusion of rulership, though I don't know how "smart" that is, given our present circumstances. In this case, the woman behind the man in charge had better step up and pull on the reins ... quick!
2006-07-10 18:40:56
·
answer #3
·
answered by patticakewithfrosting 3
·
0⤊
0⤋
Women are equal to men "legally" in most western societies but socially, most of the world remains patriarchal. Legal equality does not equal power.
2006-07-10 18:28:27
·
answer #4
·
answered by quadrophenic1973 1
·
0⤊
0⤋
Look around, and you'll see that we've steadily become a more matriarchal society. We only allow men to have the illusion of rulership because women are the peacemakers.
And look at our "President" ... now that's some leadership!
2006-07-10 18:31:03
·
answer #5
·
answered by Anonymous
·
0⤊
0⤋
I don't believe it to be a sexist remark to say that the male is more dominant; it is the same with nearly all species, except the black widow spider and a few others. Some women find it hard to accept the facts of life. Don't get me wrong, the world wouldn't function without females, but you'd find that most women feel secure with a male ruling the roost. I said most, not all. It's how the world works; it always has, but I don't think it always will.
2006-07-10 18:31:16
·
answer #6
·
answered by skippy 3
·
0⤊
0⤋
Throughout history, men have dominated women. There are many women who run some of the largest companies in the world, and many run countries. But the pay is never the same.
2006-07-10 18:28:53
·
answer #7
·
answered by hollywood71@verizon.net 5
·
0⤊
0⤋
Women in the West are not as capable of leading as their Eastern counterparts. They also use too much make-up!!
2006-07-10 18:29:50
·
answer #8
·
answered by Lu Chee Bye 2
·
0⤊
0⤋
Equal means they have the same rights and get the same treatment under the law. It does not mean they have the same capabilities. Also, since they didn't always have the same rights, we have some catching up to do.
2006-07-10 18:28:23
·
answer #9
·
answered by nursesr4evr 7
·
0⤊
0⤋
Who's to say that to rule is to be successful?
I know some of the most brilliant people, and they work under leadership.
The ruler/figurehead/boss is usually only a coordinator and nothing more: uninspired and talentless.
But hey, if that's what you aspire to, go for it.
2006-07-10 18:37:26
·
answer #10
·
answered by Todd's 3
·
0⤊
0⤋