The FDA is a government agency that is SUPPOSED to protect Americans against drugs that could cause more harm than good... but have they been doing their job? Read this article, and/or give me your input on this subject without reading it.
Article: Death By Medicine
http://utopiasilver.com/emailtemp/articlepages/deathbymedicine.htm
Is the FDA controlled by big drug companies?
Is it all about the money and not our health?
Are they being too rough on natural supplement companies?
2006-11-30 17:00:46 · 3 answers · asked by peter s in Health ➔ Other - Health
But the article sure does make you think about all the drugs that are out there, and whether they are really helping people or just numbing the feeling for a while, the way alcohol does for an alcoholic.
2006-11-30 17:32:37 · update #1
There are a lot more articles on that website about the FDA and its direct association with the drug companies.
2006-11-30 17:34:07 · update #2
Here are more articles for those of you who want to find out the truth about the FDA and its legal drug pushing.
http://utopiasilver.com/emailtemp/articlepages/archives.htm
2006-11-30 17:46:13 · update #3