
What I mean is: do Americans have a more narrow-minded sense of gender roles than other societies? In this country (where I am from and have lived my whole life), it seems that all men are expected by women to be large, tough, strong, and never cry. Is this as true in other cultures? Or is it more true in America?

2007-02-26 03:43:40 · 4 answers · asked by Anonymous in Social Science Sociology

4 answers

Not true in America. Ever since the feminist movement made great strides in our society, people have begun to accept more fluid gender roles. We have stay-at-home dads. We have men who take yoga. We have women who are in Congress, who wrestle, and who basically fill "normally" male roles. We still have a long way to go, because women are still raped and physically abused, but those cases aren't quite as frequent, and we're working on that.

2007-02-26 03:48:37 · answer #1 · answered by nicoleblingy2003 4 · 1 0

America (and the West) has a much more egalitarian view of gender roles. Probably due to the fact that America is where the feminist movement started politically.

2007-02-26 11:47:00 · answer #2 · answered by a_siberian_husky 2 · 0 0

Go ask the women of Saudi Arabia, or the Congo. Big NO.

2007-02-26 11:52:46 · answer #3 · answered by Anonymous · 0 0

No No and No

2007-02-26 11:47:10 · answer #4 · answered by The Best 3 · 0 0
