I bought a book, "The War Against Men: Why Women Are Winning and What Men Must Do If America Is to Survive." In just the first pages of the preface, I'm already encountering some serious claims by the author: that American men are under attack in nearly all areas of life, and that women have made tremendous gains over the last 30 years. There's also something about feminism infiltrating the public school system. I can't wait to read on. What do you think? Is there a war against men in America?
2006-09-17 10:32:08 · 7 answers · asked by Anonymous in Society & Culture → Other - Society & Culture