YES, teeth are part of the human body, but I think dentistry is different because there is just so much to learn in both fields. Dentists are doctors; they just specialize in your teeth. It's just like an optometrist: they specialize in your eyes. Studying the human body by itself is very complicated and hard. Even medical doctors, who study the whole body, still specialize in certain fields such as the liver or the heart, etc.
2007-02-26 06:25:27 · answer #1 · answered by Grace 4
Yes, it should be the same as an attorney touching his briefcase, although that isn't quite the same, since a briefcase doesn't have feelings. The thing is that men can get aroused during a massage, but it doesn't necessarily mean they want to do anything about it. Jealousy really just tells you how insecure someone is, but if you point that out to him, he will probably get even angrier. All you can do is act in a professional manner at all times: don't become friends with any clients, especially male ones, and don't give him any reason to question you. Trust is built over time.

The best books I ever read on relationships are "Getting the Love You Want" and "Keeping the Love You Find" by Harville Hendrix. I would also suggest counseling together, and individual counseling for each of you as well. When someone reacts so strongly to something, it is usually a projection of early childhood insecurities and unmet needs. He is trying to recreate something that happened in the past and keep proving that he is not good enough, or whatever it was for him, so he is creating this whole conflict to come between you and prove that women will leave him. There could almost be a support group for partners of massage therapists.
2016-03-16 01:10:13 · answer #2 · answered by Anonymous
The dental field is separate from the medical field, even though teeth are part of the human body.
2016-02-01 01:50:18 · answer #3 · answered by Charley 5
You need to go back a couple of hundred years. Dentists actually chose not to be included in medicine. Back then, barbers and all sorts of uneducated people were serving as physicians, so dentists opened their own dental schools to ensure they could receive quality training and education. By the way, in some parts of the world dentists are known as stomatologists; a stomatologist is a medical physician who specializes in dental diseases. Many of them work in hospital clinics, where there is a dedicated stomatology department.
2007-02-26 07:20:42 · answer #4 · answered by Anonymous
In dentistry you use different tools, and teeth are made of materials found nowhere else in the body. The fields are kept separate because doctors and dentists earn different degrees. Also, when doctors tell you you're sick, it's usually from bacteria or a virus, whereas dental problems mostly come from not brushing and treating your teeth badly. Most of the time you don't need antibiotics for dental work, though you might need a painkiller!
2007-02-26 06:24:01 · answer #5 · answered by Anonymous
What do you mean, why is it separate? Just because a dentist's office isn't located in a hospital doesn't mean that dentistry isn't part of the medical field. Dentists and oral surgeons still have to go to school for many years to become certified in what they do. It's no different from becoming a brain surgeon.
2007-02-26 06:25:26 · answer #6 · answered by SassySours 5
Teeth are part of the mouth. Medicine covers the body from head to toe, along with diagnosing viruses, colds, sore throats, and other internal organ problems.
2007-02-26 06:28:18 · answer #7 · answered by Anonymous