So women are finally finding equality... well, in some countries at least. But has it made a positive difference, or are we still somewhere along the road towards "redemption and equality"?
It was reported just today that in 5 major cities in America, including New York and Chicago, junior female recruits are being paid on average 17% more salary than their male counterparts. A good thing??
Socially, women are enjoying much greater freedoms: owning property, living independent lives, travelling, and choosing their own direction in life... but is that the case for all women, or just the lucky "rich kid" types?
Socially, the ills: the highest divorce rates in history, the highest levels of sexually transmitted diseases in history, record levels of youth suicide and teen pregnancy... not to mention bastardry (children raised without fathers, said to be 70% likely to have unstable lives).
So are things better???
2007-09-23 17:14:06 · 18 answers · asked by Anonymous in Gender Studies