I cannot imagine any sexual act (even using its name makes me feel dirty) more degrading or disgusting for a woman, and yet I see people every day attempting to say that it is somehow acceptable. Don't people realize that it will destroy your rectum and leave you wallowing in your own filth? No decent woman could ever enjoy anything so base or perverted, and yet people, some even claiming to be women, continue to say otherwise and even mock those of us who attempt to tell the truth. What is wrong with people?
2007-02-07 09:03:01 · 4 answers · asked by Anonymous in Family & Relationships ➔ Marriage & Divorce