I've recently read that in many parts of the world, women bleach their skin to try to achieve some kind of higher social status. This practice even existed in ancient times.
From what I understand, one reason was that light skin was associated with not having to perform manual labor. Another is that in areas colonized by Europeans, lighter skin became associated with power, wealth, and so on.
So my question is: in which cultures has dark skin been regarded more highly than light skin as a result of a long cultural history, rather than as a modern backlash against the opposite and more common phenomenon?
2006-08-26 05:15:42 · 7 answers · asked by LazyBunny440 in Skin & Body