
2007-10-16 07:50:05 · 21 answers · asked by Anonymous in Social Science Gender Studies

All who disagree can go to hell!

2007-10-16 07:55:30 · update #1

21 answers

True

2007-10-16 07:56:36 · answer #1 · answered by Devil's Advocette 5 · 5 4

This question cannot really be answered. As far as men are concerned, I don't believe feminism has made society a better place to live in, nor has it made society a worse place to live in... just a different place. Of course feminism has made society a better place to live in from a woman's standpoint. We can have lives and we aren't forced to depend on anyone, unless of course we want to. You also have to look at the word feminism. Crazy, over-the-top feminists make society worse off from all sides. They make men hate us even when a true feminist just wants true equality, and they make us appear less credible. Anything taken to an extreme is a bad thing.

2007-10-16 08:19:09 · answer #2 · answered by lild304 2 · 2 2

True. I can't see why some people, especially women, don't appreciate feminism. It makes me wonder, "Do these women know that they wouldn't be able to vote or hold a job without feminism?"

I think I'd be screwed, miserable, or most likely dead from suicide if it weren't for feminism, which is a big reason why I call myself a feminist: to respect those feminists who gave me a voice.

Jennifer: radical "feminism" is about man-hating; true feminism isn't. Understand that you WOULDN'T have an equal-paying job if it weren't for feminism.

Laela: well, I hope you won't be voting in the next election, because obviously you don't appreciate that right!

2007-10-16 08:12:38 · answer #3 · answered by ? 6 · 5 1

A resounding TRUE! If it weren't for feminism, men could still consider their wives their 'property' and beat them with impunity. Look up the Tracy Thurman story and learn that the police just stood by and watched as he stomped on her head (paralyzing her) because THERE WAS NO LAW AGAINST IT - until she sued the state (Connecticut, I think) AND the police force - all male at the time, BTW. Thanks to her and others like her, women are no longer second-class citizens.

And to all you males and anti-feminists who 'think' differently, I strongly suggest you catch up on your history lessons.

2007-10-20 07:23:59 · answer #4 · answered by Anonymous · 0 0

"or"

The answer, of course, depends on what one means by "better". That is, how is one to judge?

Judging by criteria important to me, I say that feminism has made society much, much better in that (at the very least) it has raised lots of important questions about, for example, the origins of many social norms. From a purely economic standpoint, it's probably helped a wee little bit that about half the population is no longer excluded from certain positions. Just some initial thoughts...

2007-10-16 08:08:37 · answer #5 · answered by language is a virus 6 · 4 0

True in many ways, untrue in others. For example, now that women are usually working, many households have two incomes. So, on a national level, this causes everyone's wages to drop. (This is basic economics and is really not up for debate.) So, it is now even harder for a single mom to make ends meet, even though feminism brought her fair pay for her work. Now she's competing with two-earner households, when before, she was competing with one-income households. So feminism has helped make it acceptable for her to work, and has helped her to be paid the same as a man, but she is still struggling because she can't keep up with the couples who have two incomes.

So, like all changes, feminism has been both good and bad.

2007-10-16 08:22:37 · answer #6 · answered by Junie 6 · 1 2

That is a big question that requires a great deal of thought.
I think that feminism has made life better for women in many ways, but has also had some negative effects as well. These types of issues are rarely all good or all bad.

Men and women were not equal before and are still not equal today. As the teeter-totter tips back and forth, things are being sorted out... and eventually will equalize so that both genders can enjoy family life and feel useful and fulfilled while they pursue life together, and as separate individuals.

The only regret that I have about the whole struggle between men and women is that most times we forget that there may be children involved. It is both parents' responsibility to put children far and above any selfish wants or needs that we may have.

In my not so humble opinion, a mother or father who forgets about their children can never truly be a respectable adult.

2007-10-16 08:02:58 · answer #7 · answered by pink 6 · 3 5

That's a matter of individual perspective. Some women would prefer to see feminism completely reversed so they would have more women sharing their traditional ideas. But everything I have ever wanted in my life has been made available to me because of feminism, so MY answer is yes.

2007-10-16 10:10:01 · answer #8 · answered by Rio Madeira 7 · 3 1

False. Sorry, people don't go to hell just because you say the words. Our family structures are falling apart. Kids are being institutionalized instead of being raised by a loving parent. Our boys and men are being pushed to the side as if they do not matter. Breastfeeding is seen as "too much of a commitment". These are just some of the ways in which feminism has harmed our society.

2007-10-16 09:47:44 · answer #9 · answered by Anonymous · 2 4

For the majority of US women, yes: feminism has made it possible for women to vote, own property, choose who they want to marry, get divorced and get custody of their kids, go to college, and work at more than a handful of jobs considered "appropriate" for women. Men aren't expected to work only outside the home, men aren't expected to marry and have kids to prove they are "mature", and men have more occupations to choose from, since jobs are no longer divided into "men's jobs" and "women's jobs".

But some US women, and some US men want women to be primarily mothers and prefer that women work at home, and think men should always work outside the home. Unfortunately, this attitude reinforces the sexist idea that men should only work outside the home and can't be as good a parent as women; and reinforces the sexist idea that women should primarily work inside the home and aren't as good at working outside the home as men.

2007-10-17 12:22:51 · answer #10 · answered by edith clarke 7 · 0 0

That's a really heated question. It all depends on the person, so you are going to get many different answers. TO ME, I think it has, but you have people out there who are so closed-minded and so old-fashioned that they refuse to let anything new into their lives, and they will continue to treat people the same...

2007-10-16 07:58:17 · answer #11 · answered by Kelsey S 2 · 1 5
