Halle Berry has done it, and so have Sharon Stone and Marilyn Monroe. So many celebrities have taken nude film roles or posed naked in Playboy. There are tons of sex tapes and nude pictures out there, like the Vanessa Hudgens scandal. Do you think people should voluntarily show their nude bodies? Do you think women should be seen naked in Playboy? Most movies show a naked woman, not a naked man, and it seems so sexist: a woman has to show her skin to sell records or movies, but a man doesn't even have to be that attractive. So, do you think it's right, smart, and good for mothers, fathers, etc. to show themselves nude, whether in porn, Playboy, movies, or magazines? Do you think it's trashy? We shouldn't be ashamed of our bodies, but when people do nudity they are often looked down on, or seen as giving it all away, as if they aren't really anything. I feel a person's body is sacred and shouldn't be seen all over the place; it should be just for the person and their spouse.
2007-12-20 10:29:35 · 6 answers · asked by crystal spring in Society & Culture ➔ Cultures & Groups ➔ Other - Cultures & Groups
It's a good thing to think about, especially since we are teaching our kids to be leaders, to have their own minds, and to acquire knowledge. Anybody can take off their clothes and stand or lounge around nude, or have sex with anybody; a person should know they are more than their skin. Still, it's each person's own body and skin. I think a lot of the time we say we don't like being compared to airbrushed models, yet we still buy the magazines. Meanwhile humanity suffers: AIDS, STDs, etc. are all on the increase, tied to this fascination with sex selling. No one practices abstinence anymore.
2007-12-20 10:54:54 · update #1