Throughout history, and continuing to the present day, darker skin color has been associated with lower social status. Statistically, the darker your skin, the less money you are likely to make.
So why is it that pale white people go to tanning salons and pay money to have their skin artificially darkened?
2006-07-18 13:47:44 · 34 answers · asked by I Know Nuttin 5
in Society & Culture ➔ Cultures & Groups ➔ Other - Cultures & Groups
By the way, did anyone read "The Sneetches" by Dr. Seuss? A tanning booth reminds me of the "star-on" and "star-off" machines.
2006-07-18 13:52:24 · update #1
Why is it that everyone assumes I "buy in" to the idea of darker people being "less" than others?? This is not the way I see it; my remark was based on the way other people seem to see it. You people are stereotyping ME!!!
I knew I was opening a can of worms with this question. Oh well.
2006-07-18 14:44:51 · update #2
I'd like to recommend two books for you: Everything but the Burden and Envy of the World. http://www.amazon.com/gp/product/076791497X/sr=8-27/qid=1153271812/ref=sr_1_27/102-8963185-7388136?ie=UTF8
I think people borrow a little bit from all people, cultures, et cetera. You asked a very good question and I don't think you're racist... to me it proves that you are NOT.
I have only met a few black people who would ever trade their skin color if given the chance. I thank God I was born black. I look at my life as a series of challenges, blessings, experiences, and victories all sewn together by love and perseverance. I can color my hair if I want and my white friend can tan her skin. I can straighten my hair and my white friend can get collagen injected into her lips. Variety is spice, man.
~Peace
2006-07-18 14:21:14 · answer #1 · answered by Sleek 7 · 4⤊ 0⤋
When white people go tanning, they don't do it so they can become darker. They go tanning so they can get a golden tan; they want to be bronze, not black.
But the reason "pale" people have dominated the social ranks throughout history may be that, for a while, Caucasians were the most domineering and imperialistic race. They took control of many things, and so they are associated with prominence. With time I'm sure all races will get their chance, but not now, since all races are blended quite nicely.
2006-07-18 13:54:30 · answer #2 · answered by Nrassm 3 · 0⤊ 0⤋
It's really only in a few countries that people tan. In ours it is a way to show that we have money, that we can waste it, and that we have leisure time in which we can lie around and get cancer. In other countries, the whiter you are, the better. I'm uber Irish (pale skin, red hair), and I went to India a few years ago. I was a goddess. The men flocked to me; some touched my hair, and most of them asked if I was married. I just started saying yes, and then they wanted to know why my husband let me go out unescorted. But I think the paleness thing has to do with not having to toil away in the sun, whereas here being tan means you can afford it even in the winter.
2006-07-18 21:42:44 · answer #3 · answered by Kellie M 2 · 0⤊ 0⤋
I have found that often when someone has to defend their remarks by saying "I'm not racist, but ..." well, you can see where I'm heading. If you think darker skin is associated with lower social status and lower income, you should tell that to all the physicians living within the Tri-State Area where I live. And frankly, to half the business owners. What is your question, really? Is it whether we buy into your theory of darker people being somehow "less" than lighter people, or is it why people want to have suntans?
2006-07-18 13:54:36 · answer #4 · answered by Rvn 5 · 0⤊ 0⤋
For the most part, Hollywood. It seems right now that when you look at people on the front pages of magazines and in movies, they have a tan. People have the misconception that a tan looks healthy, but it isn't. It is actually damaging your skin, and is the skin's way of trying to protect itself by making its melanocytes work overtime. Would you burn your face for cosmetic appeal? Also, athletes at large are darker in complexion because they actually go outside and do something, unlike those who sit around all day in their cubicles, then go home and watch TV, and the only sun they get is on the drive home. It's not healthy to be pasty white either, but it's better than burning your skin and suffering skin cancer later on.
2006-07-19 12:49:37 · answer #5 · answered by confuzioncity 2 · 0⤊ 0⤋
It is dumb stereotyping. I have seen plenty of people, of all races, who are of low social status. It has to do with their personal choices, not their skin.
Amendment: When you mentioned the Dr. Seuss thing, and tanning, you reminded me of an observation I had made. Have you ever noticed that many groups will complain about being unfairly treated, then do the same themselves? In Phoenix, most people who drive around blasting their bass, playing hate slogans, are Hispanic. Don't they know that neo-Nazi skinheads do the same? When you are hearing it at a distance, you can't make out the words. It all sounds the same.
Racism is stupid. It makes no difference who is doing it, or who it is directed toward!
2006-07-18 13:53:38 · answer #6 · answered by Marvin 7 · 0⤊ 0⤋
First of all, I read some of the previous answers, and I see how ignorant some people are for calling you a racist; you see the color of the ones that got offended. Clearly I have asked myself this question before too, because you are right. I don't think they are happy with themselves or their appearance. Remember that statistics are just a bunch of opinions put together, so don't be bothered by what they say about darker skin; we will have to battle with these problems forever.
2006-07-18 14:08:31 · answer #7 · answered by hodgesandguy 4 · 0⤊ 0⤋
I don't think it's true that darker skin color has been associated with a lower social status.
You live in a western country, so the majority of people are light-skinned. That's why you might think that dark skin carries a negative association.
If you lived in an African country, I doubt you would feel the same way about dark colored skin.
2006-07-18 13:52:53 · answer #8 · answered by brand_new_monkey 6 · 0⤊ 0⤋
Actually, you are wrong. In Africa there are tribal hierarchies that see blacker skin as higher social status. It is only western indoctrination that makes you think "white good, black slave!" And it is sad you feel that way.
As for tanning salons, they are just a way to take money from unsuspecting potential skin cancer sufferers. Some movie actress is probably tanned, and so everyone wanted to be like her. It's all peer pressure.
2006-07-18 13:52:52 · answer #9 · answered by marc k 2 · 0⤊ 0⤋
I think it has to do with the western ideal of what is popular to look like, and with the whole California lifestyle currently being the way to look. In the East it was considered more beautiful for women to stay out of the sun and have pale skin; to be tanned meant that you spent your days working in the sun. It is just the current popular thing; give it a while and it will change.
2006-07-18 13:53:56 · answer #10 · answered by Anonymous · 0⤊ 0⤋