Ok, I am just wondering what you all think. I have been wondering lately how American culture, and humans as a whole, perceive certain topics. Like, I wonder how Americans see depressed people (just on average): are they supportive, do they think they're just whining, think they're weak, feel sympathetic, what exactly? I'm asking because I am depressed, so I tend to notice people's opinions on such matters, but I live in a fairly conservative area with lots of Christians and middle-class people, and therefore any opinions I pick up here may not be very representative. Also, for subjects such as self-harm (cutting & stuff), how does America feel about it? I know many people who are sympathetic or empathetic, but I also know people who think they're just wanting attention or being weak or all that stuff. So, if you would be willing to share your opinion and how you think America looks on these sorts of topics, I would really appreciate it. For the latter, please tell me how it is, not how you wish it were.
2006-09-12 19:40:37 · 6 answers · asked by Anonymous in Health ➔ Mental Health