
do they want darker skin or something?

2007-03-11 18:37:56 · 19 answers · asked by Anonymous in Society & Culture Cultures & Groups Other - Cultures & Groups

*sunbathe*

2007-03-11 19:51:50 · update #1

19 answers

There's some idea out there that a healthy tan looks good.

Personally, I embrace my paleness.

2007-03-11 18:39:51 · answer #1 · answered by clovisdied 2 · 7 3

White women sunbathe because many of them are pale, and a tan makes them look healthier. Also, once a white woman gets a tan, she may feel more confident about her looks.

Also tanning is a very relaxing thing and it makes one feel much happier after a tanning session.

A lot of white men prefer women who have a little color to them. I know from experience. My ex-boyfriend's friends started commenting on how much hotter I looked because I was tan.

And if those answers don't suffice for you then there's always the "You want what you don't have" answer.

2007-03-12 01:48:21 · answer #2 · answered by littlemarquardt 2 · 2 0

The warmth of mother earth and the spirit of the sun. This is a refreshing birth upon the soul.
And yes, they do like to be a little darker.
That's all I have to say about that.

2007-03-12 09:40:09 · answer #3 · answered by blakree 7 · 0 0

We want to have a healthy glow; it smooths out imperfections in the skin... temporarily, until the BIG C gets ya. I prefer spray tan. I do go to the tanning bed before I go on vacation, to avoid sunburn.

2007-03-12 03:47:43 · answer #4 · answered by tiffany w 1 · 1 0

Yes, that's what they're after. I've always said "Tan fat is better than white fat". It just gives off a glow, a healthy glow. In other cultures, women use makeup and chemicals to bleach their skin white. Guess some people aren't happy with the way they were born!

2007-03-12 01:41:57 · answer #5 · answered by chimodzimodzi 2 · 7 0

I like to sunbathe to add a little color to this dull, mousy complexion. I like the warmth of lying in the sun, along with the relaxation, especially if I am at the ocean. After getting a little burned, it feels like old, dead winter skin is sloughing off, and I just feel radiant and healthier.

2007-03-12 01:45:03 · answer #6 · answered by vivib 6 · 2 2

Having a tan used to be a sign that you lived a life of leisure and did not have to work. There was also a brief period when sun exposure was thought to actually make you healthy. I have a lot of brown-skinned friends who tan to "even out their skin tone" or "get rid of zits", or so they tell me.

2007-03-12 01:51:10 · answer #7 · answered by GranolaGurl 2 · 6 0

To get a tan and a nice glow to the skin.

2007-03-12 05:29:41 · answer #8 · answered by benn26k 3 · 1 0

No, they just want to keep the skin cancer stats up. Honestly, skin cancer education is at its peak, but you still see some fools lying in the sun soaking up those UV rays.

2007-03-12 01:51:05 · answer #9 · answered by Desperate Mummy 5 · 5 0

No, they just want to lie around and get skin cancer. Sorry for the sarcasm, but the answer is pretty obvious...

2007-03-12 01:40:56 · answer #10 · answered by Serenity 4 · 7 2
