If you have to ask, the answer is no.
2007-11-08 06:51:59 · answer #1 · answered by Rio Madeira 7 · 2⤊ 1⤋
I'm taking your question to mean, "has feminism died out as a philosophy?" or "has feminism served its purpose and there's nothing to be done any more?"
As to the first, I'd say that "feminism" is an evolving philosophy, and that its early, more extreme branches had such a polarizing effect that the term "feminism" is now very unpopular in our society. If you talk to people about underlying feminist issues without using the term "feminist," like equal pay, domestic abuse, and sharing parenting and household tasks, people are more likely to respond in a way that supports what used to be thought of as "strictly feminist issues." So I'd answer the first question by saying that the label "feminist" may be history, but not the issues.
As to the second question, some of the same considerations apply: we've come a long way with attitudes about what constitutes "women's work" or a "woman's role" as opposed to "men's work" etc. -- in the U.S. and more industrialized world, professions and laws and what is socially acceptable among the genders have changed dramatically in the last 40 or so years. However, there is still so much to be done on a global basis in countries where genital mutilation is still practiced, and women can be stoned to death as a matter of honor because they were raped. I think today's "feminist" (if I can use that term) needs to pick her fights and help her international sisters before she worries about who opens a door or picks up the lunch tab. So in summary, I think there are more productive ways for U.S. feminists to spend their energy. It's now about global human rights, as far as I'm concerned.
2007-11-08 15:24:43 · answer #2 · answered by fragileindustries 4 · 1⤊ 0⤋
There was some reference to feminism, but most of American history taught in schools glorified the actions of white males!
2007-11-08 18:53:18 · answer #3 · answered by Anonymous · 0⤊ 0⤋
Do you mean is feminism a part of American history?
If so then the answer is yes.
Do you mean is feminism taught in American History classes?
If so then all the American History classes I took involved learning about feminism. So as far as I know the answer is yes.
2007-11-08 15:57:26 · answer #4 · answered by Fortis cadere cedere non potest 5 · 0⤊ 1⤋
I'm not sure exactly what you mean by the question, because I don't know how to find flyinghorse's question.
If you are asking if the feminist movement took place in history, then yes, and it is ongoing today.
If you are asking if feminism and women's issues are given adequate coverage in history classes, then most definitely not, and the tiny bit we do have about women's participation in historical events tends to be either misconstrued or outright lies.
2007-11-08 14:37:29 · answer #5 · answered by Drake 2 · 3⤊ 0⤋
I believe that feminism or the women's movement has its place in United States History. It had nothing to do with sexual orientation.
2007-11-08 16:17:05 · answer #6 · answered by anifan 3 · 1⤊ 1⤋
It's always darkest before the dawn. The Nazis looked unstoppable in their time, but eventually they and their phony history of racial superiority bit the dust. May all tyrannies meet the same fate.
2007-11-08 16:29:54 · answer #7 · answered by Anonymous · 1⤊ 1⤋
Equal rights are only brought up when it's to their advantage. What do they say when a ship is sinking? Women and children first. I bet those unshaven lesbos will quickly twirl their fingers in their hair and start skipping to the lifeboats singing "Dancing Queen" by ABBA.
2007-11-08 14:23:14 · answer #8 · answered by Anonymous · 3⤊ 3⤋
Feminism? Are you referring to lesbians too lazy to do the dishes?
2007-11-08 14:19:34 · answer #9 · answered by Anonymous · 2⤊ 7⤋