I know that medical doctors must care about people, because of all they do for us, and we can count on them when we have emergencies and need their help. If you are like anyone else, you have seen the downright stupid commercials they run for drugs. I really hope most people can see right through the drug companies' scare tactics: whatever new disease they put out there, they just happen to have the medicine for it. Does it really make any sense to put a man-made chemical (a drug) into your body to fix it?
I get regular chiropractic adjustments to keep my nervous system functioning well, and I take natural medicines. I don't get all the coughs, sniffles, and junk my coworkers all around me do, either. If you do the research yourself instead of just following the mainstream, you will only benefit. Drug companies and doctors go hand in hand. Think about it.
To the person who answered after me: if you read the very first line of my paragraph, I acknowledged that doctors do care. They must. Did you skip that part?
2006-11-15 11:48:49
answer #1
answered by Miranda E 2
I believe that the pharmaceutical industry is simply out to get your money. The doctors and pharmacists themselves may or may not have an interest in their patients' health, but all of them have been taught only to use drugs and surgery to "cure" people of their ailments. This ensures that the drug companies profit from 1) all the drugs being prescribed and 2) the sickness those drugs actually cause, thereby making sure that people continue to get sick.
There are countless natural remedies that cure illness. However, these cannot be patented, so the drug companies cannot profit from them. Instead, these companies are actually trying to make it illegal to sell or recommend natural remedies to the public.
In addition, the FDA, USDA, and FTC (all organizations that are supposed to protect the consumer) team up with the drug companies to ensure that both sides profit, all at the cost of the health of the rest of the population.
2006-11-15 19:20:01
answer #2
answered by Gabrielle 5
And why do you go to work? To feel good about yourself?
Your tone suggests you think doctors and pharmacists don't really care about their patients. They do. But the amount and difficulty of our schooling means we need to be paid!
2006-11-15 22:09:18
answer #3
answered by jloertscher 5