What did men think of women during the First World War? (Obviously this is going to be really generalised, but you know...)
Did they believe that the work women contributed to the war effort was good, or were they resentful of their jobs being taken, or did they still stick to the view that women shouldn't work in factories, that they should just stay home and knit and, oh, bake pies? (Ooh, I would've hated to live in a time when that was what people thought of women!)
Did men's (actually, now I come to think of it, women's too) views change? By the end of the war, were women seen differently? In a good way or a bad way? Had they lost their femininity?
I know I've just asked five questions :) but I'm really into World War One history at the moment! It was a time of such change -- such an abrupt shift from an old, very slowly changing world to a new one... I find it fascinating!
2007-03-04 06:21:58 · 6 answers · asked by Anonymous in History