I've always thought people (particularly Americans) place too much emphasis on physical beauty. But do we? What I mean is, if we think it's important, then clearly it is, right? We didn't pull human attributes out of a hat and randomly select physical beauty as the "thing we're going to focus on most." Clearly we have a natural leaning towards physical beauty. So maybe that's just the way it's supposed to be. Thoughts?
2007-08-24 07:10:12 · 5 answers · asked by Anonymous in Society & Culture › Other - Society & Culture