
2007-01-04 22:47:39 · 10 answers · asked by Anonymous in Society & Culture Cultures & Groups Other - Cultures & Groups

10 answers

Not at all! Every time I leave their office I feel really good because my smile looks great and I feel super clean.

2007-01-04 22:51:00 · answer #1 · answered by if i only knew 3 · 1 0

No. I'm fine with dentists. At least they know what they're doing (mine does, at least). It's the general practitioners that I avoid as much as possible and hate seeing.

2007-01-05 09:04:51 · answer #2 · answered by undir 7 · 0 0

No, I don't.
Mine is good.

But I hate being at the dentist's. The smell, the sound, and that fr**king bright light... etc etc. lol

2007-01-05 09:28:03 · answer #3 · answered by Anonymous · 0 0

Hell yeah. I had one inject into an abscess once and then pull a tooth out without knocking me out. Oh, and I was like 6 or 7 at the time.

Now I don't trust them as far as I can spit without my salivary glands.

2007-01-05 06:50:32 · answer #4 · answered by Scott Bull 6 · 0 0

No, but my dentist always says "if you ignore your teeth they will go away."

2007-01-05 06:53:18 · answer #5 · answered by LadyB!™ 4 · 1 0

Here is a little joke:
You come to the dentist in PAIN,
you sit in the chair, he creates MORE PAIN,
and to add insult to injury, HE causes MORE PAIN when you leave <> boom boom!

2007-01-05 07:19:48 · answer #6 · answered by ? 5 · 1 0

No, they help make your teeth look better.

2007-01-05 07:17:21 · answer #7 · answered by Curly Q 3 · 1 0

No. Why do you want rotten teeth?

2007-01-05 06:58:42 · answer #8 · answered by curious 1 · 1 0

No. Mine is great.

2007-01-05 06:48:33 · answer #9 · answered by lou b 6 · 1 0

They are all evil.

2007-01-05 06:51:20 · answer #10 · answered by ? 6 · 0 0
