
2007-09-06 08:41:23 · 19 answers · asked by Anonymous in Social Science Gender Studies

If women hadn't decided to become independent, they would still be in the house taking care of the children, where they belong. Then kids wouldn't be out killing kids and using drugs.

2007-09-06 08:52:46 · update #1

19 answers

Oh, absolutely!!! If uppity women hadn't demanded they be treated like human beings and full-on members of society, Western civilization would never have declined the way it has! And men would still be opening doors for girls and picking up the check without complaining!! There would be no drugs and no pornography and no domestic abuse and no infomercials and no pollution and no racial problems and no global warming and no recession and no mortgage crisis and no gasoline crisis and no inflation and no war on terrorism and no flea and tick problems and no lead in Chinese-made toys and no cancer and . . . and . . . Jerry's kids would be WALKING!

EVERYTHING is feminism's fault!

2007-09-06 09:02:24 · answer #1 · answered by Anonymous · 9 2

As a woman, I don't think the answer is as simple as you make it seem. However, I would agree with you to a point. Women entered the workforce wanting to do the same things as men (note that I am not talking about true equality, but about equal ability). Naturally, men and women are different (not just in their bodies but in how they approach situations). When the feminist movement went so far as to disregard men's contributions and basically disrespect men, that is when some men bailed out on their responsibilities of being leaders in every sense of the word (I did not say dictators, I said leaders, and there is a world of difference). So yes, I believe the domino effect of a large percentage of women entering the workforce when they didn't need to (yes, there were women who needed to, but nowhere near the numbers who didn't) and downplaying the importance of being a mother first is a major cause of the breakdown of American society and morals. As the family goes, so goes society. If families are falling apart, it should be no surprise that society is doing likewise.

2007-09-07 02:43:17 · answer #2 · answered by G M L 4 · 0 1

FYI... women have always worked. Always. You're living the 1950s myth: the one period in American society when women were at home, because they had been pushed out of the jobs they held while the men were away at war. The 1950s were an anomaly. So, have morals been destroyed? If you believe that women alone hold the country's morals in their hands, well, you're wrong. The premise here is also wrong, so it's difficult to answer a question founded on misinformation.

2007-09-06 17:49:00 · answer #3 · answered by teeleecee 6 · 3 0

No, I don't think so. Women have been in the workforce as long as the workforce has existed; they haven't necessarily been at the same jobs, but they've been there. The idea of women staying home to mind the children and the hearth *and nothing else* is a fiction created, if I remember correctly, by upper-class Victorian society.

Before the industrial revolution, we women were at work on farms and in schoolhouses, working as hard as men, if not harder. Some of us were even merchants. After the industrial revolution, we joined the factories, risking life and limb just as often as men, if not more often, because of the loose skirts women wore (remember, women weren't expected or allowed to wear trousers until at least the 1960s).

American morals and society were destroyed after the Second World War, when middle-class as well as upper-class women were forced en masse into their homes and kitchens... if not in reality, then in the mind of popular culture. If you squelch the spirit and destroy the potential of half the adult population of a country, you can't very well expect the other half to support them *and* all the children by themselves.

2007-09-06 16:54:22 · answer #4 · answered by Cine 2 · 5 0

This society has been quite violent and murderous for a long time: there's a reason the majority of Native Americans were wiped out, why the Klan has tortured and murdered with impunity, and why immigrants have been and still are hated and murdered for daring to try to start a new life here. American morals and society have always been filled with hate, violence, and murder. Women entering the workforce has nothing to do with the state of American morals and society; American morals and society have never been something to be proud of.

2007-09-06 21:51:57 · answer #5 · answered by edith clarke 7 · 0 0

Women were only considered to have left the American workforce between the Depression and WWII. My great-grandma told me so, as she and every other woman in my family were employed their whole lives. Frequently, women worked from home or had factory jobs at odd hours. Ward and June Cleaver were pretty much one of the last vestiges of the 'exclusive homemaker' household. Today's perception is a holdover from nostalgia for the fiction of the era, like Gatsby. Generally, it was only the wives of the wealthy who had lives of leisure, because they hired servants, most of whom were, as always, women.

It is the same as 'Remember the good old days, when X happened and X were X.' Well, usually, X never happened, and X were never really X anyway.

It is just propaganda. The woman staying home and not being employed was supposed to be an American status symbol, but they only showed the rich people. It went along with the idea that our democratic free-market system was the best because our people were so rich our women didn't have to work!

Poor women always worked, except when they couldn't (like when there were no jobs). Remember that, because of poll taxes, poor people could not vote in this country until well after women could. This ensured that only moneyed, society people were involved in politics at all. And the abolition of child labor is less than 100 years old. In fact, our industrial revolution was built on the labor of women and children.

It's like the New Jersey shark panic. Only five people were bitten, but the idea still persists that there was this wave of shark attacks, and the truth is, it just never happened that way. At that time, people were dying of polio, not sharks.

The American moral is still, as it always was, 'Cash rules everything around me.' We fought for our regional governor's right to levy and collect taxes. Yay, Ben. Women have ALWAYS been a part of our workforce, one way or the other.

2007-09-06 15:57:01 · answer #6 · answered by eine kleine nukedmusik 6 · 12 0

I remember how nice it was that my mom didn't work until my younger sister started school. HOWEVER, the world today is different. It takes two incomes to get by, and somehow many single dads and moms are doing it by themselves. American morals and society as a whole have declined for many reasons, but I could write a book on that.

2007-09-06 15:59:28 · answer #7 · answered by michelle 6 · 1 1

It doesn't have anything to do with women in the workforce. I'm an American woman and a housewife, and when I have children I will be a stay-at-home mom, mainly because I respect my traditional role in society and because in our religion (Islam) it's the man's job to go out and provide for his wife and children while the mother stays home to look after the house and take care of the husband and children. I think it has to do with the USA being so Godless. Think about it. Back in the 1800s people were more devoted to their families and everything was cheaper than it is now. People were more religious then than they are now. Now you have Christians who only go to church on Sunday, and homosexuality and abortion running rampant through this country. You have girls having sex outside of marriage. You have mixed schools, which also contributes to the lack of morals and is a reason so many youth are tempted to have kids out of wedlock. If we went back to the good old days, when children were seen as adults at 9 years old, then maybe the USA would be better.

2007-09-06 18:19:26 · answer #8 · answered by Anonymous · 0 4

No. Our moral downfall is more closely associated with the acts that gave corporations equal, and in some instances greater, rights than actual living, breathing citizens. See links.

2007-09-06 17:18:52 · answer #9 · answered by Anonymous · 4 0

Stop the wars. American women first entered the work force en masse during WW2 when so many men entered the military that no one was left at home to do the work.

2007-09-06 15:54:57 · answer #10 · answered by nursesr4evr 7 · 2 2
