I must preface this by saying that I mean no injury to either sex by asking this; I am merely curious. Please do not take this as an indication that I am sexist, for I am not. But here goes:
Men have always been in a position of power over women, but why has it been like that? The only logical conclusion that I can come to is that men are just inherently better than women, so from the very beginning men have had a physical, intellectual, etc. advantage over women, and thus have taken the dominant position. This question could also be extended to other races. The white race has always been dominant, so is there an intrinsic aspect of the white race that makes them better? Or has it been mere luck that the white race and the male sex have succeeded in subjugating other sexes and races?
Just some questions to stew over.
2006-10-09 10:07:42 · 10 answers · asked by The Duke 2