I know it's not all of you, but it seems that a large section of the American population has a lot of hang-ups when it comes to sex. It seems to be OK as long as you don't look at it or talk about it. If you watch pornography it's considered sick and dirty, but many other countries accept it as part of life, regulate it, and keep it to a red-light district, so if you aren't into it, you just don't watch it or go there.
I don't necessarily mean just porn, but sex in general... I recently read about an outcry in a US state over a picture of a nursing mother feeding her baby. There was no nipple exposed, it's a completely natural thing, and it was on the cover of a magazine for nursing mothers available only in doctors' offices, yet people were outraged. WHY?
Tell me what you think. I'm curious as to why sex is widely considered a bad thing in a place where it's quite OK to own a gun, people are shot dead every day, and it's normal to watch extreme violence on TV and in movies daily.
2006-08-06 00:29:37 · 19 answers · asked by punkvixen