
2007-02-08 10:54:26 · 20 answers · asked by Anonymous in Social Science › Gender Studies

20 answers

Feminism has improved UK society by bringing about positive social changes which give equality and end discrimination. The only people it would hurt are the ruling class, who would have their power questioned.

2007-02-10 09:26:10 · answer #1 · answered by Deirdre O 7 · 0 1

Do you live in the UK?

What with the class system, the historical problems, the differences between counties and areas, the millions and millions of new additions every day which create tension and overcrowding, the dishonest politicians, the complete lack of acceptance of anything or anybody north of Manchester,
the bigotry, the weather, the Scots, the Welsh, the Irish, the Islanders, the stiff upper lip, the public schools that are not public, the monetary system that is designed to keep the poor people poor...
Oh yes, let's blame feminism.

2007-02-12 17:20:25 · answer #2 · answered by sylvia a 3 · 0 0

This has got to be the poorest set of answers I have seen for a long time. Of course it has damaged society. The male role has been undermined in every arena of life: marriage, education, the workplace, morality (abortion, the so-called woman's right to choose; the child doesn't get a choice), the law. The whole country has been feminised, and feminine attitudes have pervaded every arena of life. Even the Government has become a bossy, touchy-feely, nannying and controlling bunch. The media certainly has.

If a man so much as looks at a woman the wrong way, particularly in the workplace, he is guilty of sexual harassment, with all the anti-male legislation brought to bear on him. Violence in marriage and other types of relationships is deemed to be a solely male thing, when it certainly isn't.

This is actually much too big a subject for Yahoo Answers, but it must be depressing for feminists, judging by the other responses, that the men don't seem to have even noticed the effects of feminism. Incidentally, there are a lot of male feminists and other sycophants.

2007-02-10 08:02:40 · answer #3 · answered by Veritas 7 · 1 1

Try framing the question as 'UK society has changed; how much of this is due to a successful feminist agenda?'
Women are more prominent in the workplace, but this is in part due to the jobs available in a post-industrial economy.
As a corollary of this, many men whose roles were defined in terms of traditional jobs feel confused and threatened about their masculinity. Legislation has been introduced which seeks to ensure equal opportunities for the sexes in employment.
Many women are now showing an increase in conditions which were previously predominantly a male health issue.
Some women are postponing childbirth in order to pursue career opportunities.
Due to the pill, women are able to indulge in promiscuous behaviour on a scale rivalling that of their male counterparts.
So feminism has probably contributed to changes in society and the commerce between males and females therein.

2007-02-08 19:41:04 · answer #4 · answered by troothskr 4 · 0 2

Personally, I would say yes. One of the reasons why is that there are still so many inequality issues which have yet to be addressed, as well as new ones which seem to crop up these days. Of course, women are often inadequately treated, and things like pay are still very much a problem that hasn't been resolved. And yes, women are often portrayed in ways that ridicule and stereotype them in general. But the truth of the matter, for me, is that I don't see what all this gender-bashing leads to. There are some feminists who wish to condemn and attack men, yet some of their reasons for doing so often don't add up. Feminism, for me, is something that, even though it's important for women to highlight and echo such gender differences, shouldn't involve having a go at men all the time. And besides, two of my best mates are male, as opposed to female, so I don't see why men and women cannot get along together, because the truth is we can.

2007-02-09 15:52:36 · answer #5 · answered by Anonymous · 1 0

Yes, I would say it has. I remember so many gentlemanly things from my youth, which are completely out of the window now, and a lady who likes to be treated like a lady misses that. I do not think this feminism thing had to go across the board; in some workplaces it was needed, but as usual we go overboard, all or nothing so to speak.

2007-02-09 11:45:33 · answer #6 · answered by deep in thought 4 · 3 0

No.

Feminism is about people being treated as individuals, not about treating people according to stereotypes.

We need more feminism, not less: British society will prosper when people are able to follow their talents rather than being typecast by gender.

2007-02-10 08:47:08 · answer #7 · answered by Anonymous · 0 0

Feminism has made the United Kingdom (and the world as a whole - including my country, America) a better place to live.

Women are now free to live their own lives, make their own money and participate in society as equals to men, rather than being downtrodden household drudges tied to the kitchen, the nursery and the bedroom!

2007-02-08 22:02:13 · answer #8 · answered by Anonymous · 0 2

I would agree with the answer above me. Personally, I thoroughly enjoy being outnumbered by the women in my office.

As well as the promiscuity, women have embraced alcohol in a big way, and almost all the binge drinkers I know are women aged between 25 and 45.

2007-02-08 20:06:11 · answer #9 · answered by Anonymous · 0 2

Crikey, you're a bit late off the starting blocks; feminism has been around for many, many years!

2007-02-08 21:29:34 · answer #10 · answered by DIANNE M 3 · 1 1
