This is a very interesting question and I think it might be a little of both. Or a lot of both.
2007-03-14 08:29:53 · answer #1 · answered by Becky 5 · 3⤊ 2⤋
If you are talking about the USA, then:
1. Women were "given" the right to vote through legislation.
2. Women "seized" (and are still seizing) their right to equal pay and respect in the workforce.
Both were accomplished by hard work and determination against the odds, as is common with every minority.
How did a voiceless portion of the population exercise enough power to take something for nothing?
Women have the ultimate bargaining tool: p*ssy. As long as the human race is sexually active, women will always have sex as a way to strike at the Achilles' heel of men.
So even women in extremely male-dominated societies (e.g., Muslim societies in the Middle East) have some bargaining power.
To answer your other question, society (at least in the USA) cannot take away women's rights, as women make up half of that society.
2007-03-14 16:55:45 · answer #2 · answered by Anonymous · 3⤊ 0⤋
What a naive question! Society never "gives" equal rights to anybody. Rights are always won as the result of a long struggle.
Minorities win their rights not because they are powerful, but because they gradually convince the whole of society that equal rights are indispensable not only for minorities but for everyone.
The moral progress of human society cannot be stopped or reversed (at least not permanently). Every minority, including women and gays, will eventually have full equal rights in all aspects of life. It is absolutely inevitable; otherwise human society will not survive.
2007-03-14 18:15:03 · answer #3 · answered by Anonymous · 3⤊ 1⤋
This question can be applied to minorities as well. Would you say that black people were "given" their rights by white people, or did they fight for them? How did black people, who were systematically oppressed, find the "voice" to demand equal rights? Without the feminist movement, women would still have very few job options, almost NO career options, and be paid very little (comparatively). They would still be expected to stay home when they married, regardless of what they really wanted, and they wouldn't even have the right to vote. Society didn't suddenly "decide" to give us equal rights; we found our voice and demanded them. As for those rights being taken away, that's why we still need the feminist movement: there are some who think everything that was fought for and gained should be taken away.
2007-03-14 16:51:12 · answer #4 · answered by wendy g 7 · 4⤊ 1⤋
They took their rights and fought for them. If someone works hard enough at something, something will come from it. And everybody has a voice; they just have to realize that they have one and can use it in their own right. All women's actions came together and made a difference. The only thing that was lost was a world ruled by rich white men. Some people may not like that, but people have a right to live how they would like. So women aren't stay-at-home moms all the time. So what? It wasn't the group that decided; individual people choose. That is what came from gaining their rights.
2007-03-14 17:56:11 · answer #5 · answered by that_cool_chic 2 · 3⤊ 1⤋
As always, women control 100% of sex and 90% of the money. Men were coerced by one, the other, or both into amending the Constitution specifically for women, although technically it was not necessary.
FYI, women were already voting in some states before the amendment to the U.S. Constitution that gave all adult women the vote, and several women had already been elected to public office (by men).
Could the right to vote be repealed? Yes. However, if the past 100 years is any indication, it is more likely to be women denying men the right to vote.
The problem is that in the battle of the sexes, only one side shows up. Men do not have a unified political voice like the one women hold through various misandrist organizations such as the National Organization of Whiners.
2007-03-14 16:19:26 · answer #6 · answered by Phil #3 5 · 7⤊ 4⤋
Women are, and always have been, the sex that is clearly in charge, both behind the scenes and in fact.
This truth is denied to help men feel superior and thus happier. For years it also produced cozy little happy homes. Then women got tired of being unpaid help.
Does the saying "The hand that rocks the cradle rules the world" mean anything?
Good luck.
2007-03-14 18:16:50 · answer #7 · answered by Croa 6 · 2⤊ 3⤋
It's not about taking or giving rights; it's about adjusting men's attitude toward the respect that every other human on earth deserves.
2007-03-14 15:44:18 · answer #8 · answered by lonewolf07 2 · 7⤊ 2⤋