I can't speak for folks in other countries, because I have never been there, but folks in the United States generally assume women have it all: education, job opportunities, birth control, love control, financial freedom, and a better chance of success than men. But women still lack the essential freedom they lacked a century ago: equality. Women are minorities in every sector of our government and economy, and women are still expected to raise families while earning incomes lower than what men earn, often for doing the exact same job. And in our culture, advertisers still depict women as bimbos & bloodsuckers to sell everything from computers to cars.
Will it take another century or another millennium before the biological differences between men and women are no longer taken as a carte blanche justification for the unequal treatment of women?
What are YOUR thoughts? Please be serious, honest & forthcoming in your replies.
2007-05-13 19:34:55 · 12 answers · asked by Anonymous in Polls & Surveys