
It appears to me that they may have a vested interest in protecting their own jobs. I have found many alternative remedies which have proved very helpful for thousands of people, yet the medical profession either hasn't heard of them or gives them the thumbs down. Aren't they supposed to have their patients' health foremost in their minds?

2007-01-28 10:50:46 · 7 answers · asked by wheels 2 in Health Alternative Medicine

7 answers

Exactly what Janet says: because there is no money in it for the big pharma companies. Herbs can't be patented.

2007-01-28 11:43:15 · answer #1 · answered by Anonymous · 2 1

Because the effects are extremely low. Sometimes they'll have the reverse effect. You see all those pills, those formulas, those techniques that you can buy for $$$ in stores? NONE of them work. Vitamins? Bull****.

The direct answer to your question, and I don't want to be rude or anything, is that medical professionals are much more educated than most of us. And tell me, why would a doctor get any more money by turning people away from natural remedies? Hey, the doctor could just tell them to go try the remedies, wait a few weeks, see the patient again, and make more $$$.

Let me demystify natural remedies for you. Only REAL natural remedies work; for example, eating FRESH fish will help your mental state. Oriental medicine really works, as long as it's done naturally. Those bottled pills and formulas you get in stores, labelled "natural", are nothing but lies and do not work. You want examples? These are taken from scientific articles, not from what you read on the pills' labels.

Vitamin C? Absolutely no effect whatsoever on anything. Vitamin B? In some patients it works, in others it doesn't. Vitamin D? Actually, it reduces your lifespan.

What the medical profession is trying to do, though more kindly, is open your eyes to the lies that those remedies represent. I'm talking about mass-produced commercial pills and products of the same nature. However, most people who use natural remedies believe so strongly that they work that they will dismiss any comment steering them away from their ways. But hey, your loss is the company's gain.

I insist again that real natural exercises (oriental techniques) do work, as does eating certain fresh products. Anything that comes in a bottle or a sack doesn't work. It's purely placebo, or it'll hurt you. I truly hope you'll listen to me, but most people I talk to don't.

2007-01-28 14:44:59 · answer #2 · answered by jonny_patry 1 · 1 0

The overriding reason is that the pharmaceutical companies that produce prescription drugs represent money in the medical industry. While there are many doctors who are very dedicated to caring for and healing their patients, the medical industry as a whole is very profit-driven and not always to the benefit of the patient.

There are many beneficial medications that have undoubtedly advanced the medical profession in terms of improving patients' health and survival rates. Unfortunately, the majority of traditionally educated doctors are not trained in natural healing methods, because these methods are not part of the money-driven pharmaceutical market. There are millions of drugs undergoing clinical trials, and only drugs that are targeted at a large usage group and guaranteed millions in profits are approved by the FDA.

The good news is that integrative medicine is becoming more recognized; it combines preventive health measures with traditional drug treatment.

2007-01-28 12:09:24 · answer #3 · answered by Anonymous · 1 1

I think the reasons are twofold. First of all, the typical American doctor is not taught alternative medicine in school, so you cannot expect him to be knowledgeable about it. Secondly, Western medicine traditionally relies on the "scientific method" when approaching healing, and that includes testing remedies. A doctor whose job and reputation are on the line feels better knowing that the medicine and treatment he recommends is backed by clinical research.

I think there are very few doctors out there who just want money and don't care at all about their patients. The vast majority are caring and skilled individuals who practice as they've been taught and in ways that they truly feel will be beneficial.

Fortunately, more studies are now being done on herbal and natural remedies, giving doctors the information and impetus they need. Many are now adding alternative medicine to their practice as a sort of "holistic approach."

This is a good thing, as all of the different approaches to medicine have something to offer.

2007-01-28 10:59:50 · answer #4 · answered by Veritas 7 · 1 0

The medical profession only recognises a drug if it can be shown in tests to produce healing effects. The ones that do are considered standard medicine. The ones that don't are called "natural", but the medical profession does not believe they work. They're no more natural than any other drugs on sale, which are almost all based on extracts from natural plants anyway.

2007-01-28 11:00:11 · answer #5 · answered by Gnomon 6 · 1 0

It could just be an issue of medical malpractice suits, which unfortunately would be easier to pursue if the MD used herbs on a regular basis. In America, herbals are not considered "good medicine", and until that paradigm changes, this is the way it's gonna be.

2007-01-28 12:23:38 · answer #6 · answered by Hoolia 4 · 0 0

Because there is no money in it for the big pharma companies.

2007-01-28 11:27:55 · answer #7 · answered by janet 3 · 2 1
