I haven't heard of any disease being cured lately. I can't see what incentive the drug companies have to find cures; if anything, they seem to be inventing diseases to sell more drugs.
The side effects of drugs seem worse than what they are treating. I lost my sense of smell to a high blood pressure drug.
The FDA approves these drugs even though they kill and injure millions of people every year.
What have the drug companies done to earn the right to be the only ones allowed to provide solutions to our health care problems?
2006-06-26 02:38:30 · 5 answers · asked by Anonymous in Health ➔ General Health Care ➔ Other - General Health Care