Over the past century, there's been a shift in attitudes about the connection between medicine and identity.
To what extent do you believe one's personal sense of fulfillment is connected to the physical? Can medicine help us feel more like our "true," "natural" selves?
Do you think a lot of this concept is typically American, and are the idealized identities we strive to attain seeded into our minds mainly by the culture?
When we take medication, undergo cosmetic surgery, and so on, are we doing it for our own self-fulfillment, or out of an inner inferiority complex, a feeling that we don't fit the bill of mental health or appearance? And if it really is about repairing our self-esteem and ending suffering, does it then become a serious medical treatment, comparable to therapy?
And, when it comes down to it, is there really anything to be done about it, if this is all relative to general social standards and human nature?
2007-12-10 · asked by Emocide Organ in Social Science ➔ Psychology