I really don't mean to offend men in general, but I can't help looking at all the havoc in the world, past and present, and concluding that it has predominantly been caused by men.
Would women make this world a better place? I don't know, but it seems to me that men aren't doing too good a job.
Not to mention the role the almighty penis has played. Empires have fallen, presidents have embroiled themselves in scandal, clergymen have fallen from grace, and pedophiles prey on our children and women. All because men can't control their zippers?
Would there be less war and more compassion for people in general? Would society be safer and more economically sound?
Again, men, I am very aware of the wonderful men who are out there. Great ones and heroes. I mean to offend no one personally.
2006-11-06 11:56:57 · 25 answers · asked by Paige2