In the past, women have needed champions in the form of feminists to win equal rights (the right to vote, to work, etc.). However, this is no longer the situation in many places around the world. Are feminists still needed in our society (in the Western world ONLY)?
Why do women feel they need to become feminists?
If they are still needed, why? And if not, why are feminists still around?
Are feminists now doing more harm than good, and if so, to whom?
And another question: who has it better off, men or women? (Please do not quote tired old stereotypes; instead, offer truth in the form of personal experience or verifiable knowledge.)
Please explain your answers. Thank you in advance.
asked by Arthur N · 2007-01-28 18:30:45 · 7 answers