Halle Berry has done it, and so have Sharon Stone and Marilyn Monroe. So many have done nude film roles or posed naked in Playboy. There are tons of sex tapes and nude pics out there, like the Vanessa Hudgens scandal. Do you think people should voluntarily show their nude bodies? Do you think women should be seen naked in Playboy? Most movies show a naked woman, not a naked man, and it seems so sexist. So sexist that a woman has to show her skin to sell records or movies, but a man doesn't have to be that attractive. So... do you think it's right, smart, and good for mothers, fathers, etc. to show themselves nude, whether it be in porn, Playboy, movies, magazines, etc.? Do you think it's trashy? We shouldn't be ashamed of our bodies, but then again, when people do nudity they are often looked down on, or seen as giving it all away, as if they aren't really anything. I feel a person's body is sacred and shouldn't be seen all over the place; it should be just for that person and their spouse.
2007-12-20 10:29:35 · 6 answers · asked by crystal spring in Other - Cultures & Groups