
8 answers

You wish doctors would try to cure the sick... because these days doctors want money. Scenario 1: You go see a doctor because you are sick. The doctor writes a prescription and charges you an arm and a leg, but you are still sick.

2006-07-16 09:28:17 · answer #1 · answered by RunSueRun 5 · 0 0

I could talk about this for hours. Don't be fooled. The doctor and drug connection is a billion-dollar industry that gives no thought to people's health. It is strictly a "business". That's what they DO. Granted, there are some legit doctors, but they still push drugs. Drugs do nothing more than mask the symptoms! I am 60 years old, have always used alternative medicine, have never needed a doctor, and have had no surgeries. If you want a good doctor, find a holistic doctor in your area. They will help you with natural remedies which have no harmful side effects AND will actually remedy your health problem! And something else: yes, there IS a cure for cancer and AIDS! Anyone who tells you differently doesn't have their facts straight. Above all, always protect your immune system and research thoroughly before making your own medical choices... but absolutely DO make your own health care choices. If you don't take control, no one will do it for you.

2006-07-16 08:55:03 · answer #2 · answered by ILCatLover 1 · 0 0

Money talks. I knew one doctor who was kind, helpful, and actually helped his patients. The other doctors in this area bullied him and spread horrible rumors about him to run him out of town. He ended up closing his practice because he was fighting against too many people who wanted to keep patients sick, bouncing from doctor to doctor searching for help. Some books on the subject:

The Truth About the Drug Companies: How They Deceive Us and What to Do About It, by Marcia Angell (Hardcover)

Selling Sickness: How the World's Biggest Pharmaceutical Companies Are Turning Us All into Patients (Hardcover)

Inside the FDA: The Business and Politics Behind the Drugs We Take and the Food We Eat, by Fran Hawthorne (Hardcover)

2006-07-16 08:48:40 · answer #3 · answered by Anonymous · 0 0

It depends on the doctor you're talking about. You don't really know what's going on in the doctor's head, and people will be divided in opinion on this matter. Visit this link: http://www.mercola.com/2000/jul/30/doctors_death.htm
It says doctors are the 3rd leading cause of death in America. Maybe doctors do not care about patients' health enough to do anything. Doctors' jobs involve taking care of many people; think of them like a fast-food restaurant, as an analogy. They're limited, and they focus on one aspect of a patient's problems. You cannot expect doctors to help with or solve everything, or even come close. We also need to understand that doctors are just like us (normal and average), apart from having great knowledge.

2006-07-16 08:51:50 · answer #4 · answered by howie 2 · 0 0

Some doctors go into medicine for the prestige, some for the money, and others because they really do have a desire to help people. Then there are those who do it for both the money and the desire to be of help. Overall, I think almost all doctors do their best regardless of the reason they chose medicine.

2006-07-16 08:50:15 · answer #5 · answered by WILLIAM R T 3 · 0 0

It depends on the doctor... I am sure you are aware that there are many types of people in the world, but I do know that doctors take an oath to take care of people. Yes, some do it for the money, but others do it for both; they like to earn what they get paid.
I know a person who was going to get surgery; one doctor told him, "I'll do the surgery tonight," but at another hospital they ran tests and got him ready, physically and mentally. So it does differ.

2006-07-16 08:48:30 · answer #6 · answered by 3umar 3 · 0 0

My doctor has a joint venture with a pharmacy next to his office. So in a case like that, I think that if it's something very minor (like a cold), they'll prescribe you something from there to help you get over it quicker. But if it's major, I'm sure the majority of them will do their damnedest to help you, regardless of the money.

2006-07-16 08:49:28 · answer #7 · answered by Anonymous · 0 0

Cure the sick. Those who enter medicine for the money soon become disenchanted with the field and hate their jobs. Those who want money can go into business and make millions.

2006-07-16 08:46:05 · answer #8 · answered by Bananas 2 · 0 0

The vast majority of doctors care about people, not money.

2006-07-16 08:43:48 · answer #9 · answered by Meg 5 · 0 0

Money, money, money! Try getting treated without insurance!

2006-07-16 08:44:48 · answer #10 · answered by Retarded Dave 5 · 0 0
