Women have always been in the workforce. There was a Supreme Court decision in the early 1900s (Muller v. Oregon, 1908) about women working longer than the specified hours. Look at the Triangle Shirtwaist Factory fire of 1911 - its victims were mostly female employees. Women were teachers and midwives. Florence Nightingale founded modern nursing in the mid-1800s. Marie Curie won Nobel Prizes in 1903 and 1911.
Children worked right alongside their parents to help earn money, and poor women couldn't afford to stay home with the kids. That's why child labor laws now exist.
Middle- and upper-class women didn't have to work, and there was also a stigma against them working. They are the women who eventually started working outside the home during WWII.
And contrary to what another poster said, feminists NEVER created the perception that women didn't work in order to push an agenda.
2006-11-14 13:13:46 · answer #1 · answered by bored 2 · 0⤊ 1⤋
Women have always been a part of the workforce; however, they were paid a great deal less than their male counterparts. During the Second World War, women took over many traditionally male jobs because the men were overseas fighting. When those men returned to the home front, these jobs were taken away from the women, because men were still considered the primary wage earners, despite the fact that many of these women were now supporting the families of men killed in action.
2006-11-14 17:06:46 · answer #2 · answered by Deirdre O 7 · 1⤊ 0⤋
Fat Feminists should stop perpetrating silly lies.. concocted for them to justify their woeful existences..
Women have ALWAYS been a part of the workforce.. they may not have been executive managers, but they may have been nurses.. or cooks..
These days more and more males are nurses and cooks.. and more and more women are doing male jobs.. like light manual labor and "modern jobs" like working for an insurance company.. to say women entered the workforce after WW2 is STUPID!!
Mass economic production happened after WW2.. before that.. people ploughed their fields more or less.. and lived that simple life.. women had no jobs to really do..
Modern times have FACILITATED jobs for women in computer fields, in management, and increasingly in politics too.. one bastion traditionally dominated by males.. that is true..
Apart from that.. women have always played their role.. and feminists need to stop manipulating the nurturing aspect of REAL women to further their own lesbotic cause!
2006-11-14 22:12:58 · answer #3 · answered by Anonymous · 0⤊ 1⤋
Oh I see, the feminists get the blame again.
Women have always been part of the workforce, and any 'feminist' who tells you otherwise is NOT a feminist.
But hey, why listen to me? You've all already made up your minds about what feminism is and does.
(This is not aimed at the asker, but at the mindless drivel spouted by so many people in answer to your question.)
2006-11-15 03:11:29 · answer #4 · answered by Anonymous · 1⤊ 0⤋
There was never a time in the US when women were not allowed to enter the workforce; however, many didn't have to work until fairly modern times.
2006-11-14 12:58:25 · answer #5 · answered by karl k 6 · 1⤊ 1⤋
Women were always in the workforce. The idea that they weren't is just a myth created by 'feminists' to gain sympathy and support for their agenda.
2006-11-14 12:20:24 · answer #6 · answered by girls_role_model 2 · 2⤊ 3⤋
I don't know the exact year, but it was during World War 2. Probably 1942 or so.
Rose
2006-11-14 12:42:52 · answer #7 · answered by katierosemathieu 2 · 1⤊ 2⤋