
Do they make you feel bad and actually do more harm than good (for example, if you aren't as thin as the people in them)? I think that isn't the point of the movies, but they do seem to have that effect, and perhaps they shouldn't be shown, or should carry a warning beforehand, for that reason.

2006-12-30 10:54:22 · 2 answers · asked by Anonymous in Health Mental Health

2 answers

I think that they do make some people feel worse. They might also make others feel like they aren't so alone, though. And they also educate the public and show that recovery is possible.

It's hard not to watch them, but if they make you feel bad, maybe it would be good to walk away from the TV.

Good luck!

2006-12-31 09:24:04 · answer #1 · answered by jdphd 5 · 0 0

No, I think they make people aware that it is real and a widespread problem. When I was young and had anorexia, nobody seemed to know what to do.

2006-12-30 11:30:08 · answer #2 · answered by kat_luvr2003 6 · 0 0
