I'm sure they'd all prefer to sell you drugs that work and aren't harmful, but they're in business to make a profit, and they make huge profits, a lot of which they invest in advertising their drugs to get more people to buy them. They will try to convince you to buy whatever products they make, whether those products work or not. Drug companies should not be trusted - they are not trying to help you, they are trying to make money.
2007-06-19 07:46:54 · 12 answers · asked by cdrfishi in Corporations