Well, again, not trying to be sexist, but don't you see every day on the news men abusing, raping, molesting, and murdering women and children? Now I'm not saying women don't do evil, shitty things too. Trust me, they do! But it seems like the majority of men are doing this stuff to America. Now don't get me wrong, I've heard of women abusing their children, but it's rare, and I've heard of women abusing their husbands, and again, it's rare. It seems, though, that every day men are doing all kinds of things that are tearing America apart, while most women have put most of this country together. But I'm a man, and I was raised by men, and they personally told me that if there were no women, we'd all be cavemen. So it sounds as if most men are making a world for themselves, but if they turn against us, then they're gone. So I don't want you to think I'm sexist, but hear me out: do you think that's what's happening to America?
2006-06-21 22:44:33 · 8 answers · asked by Anonymous in Current Events