
3 answers

American schools don't offer a master's in medicine; no legitimate school does, anyway.

If you want to be a doctor, you are looking for a school's MD program.

I've found two medical schools in the Miami area: the University of Miami and the University of South Florida. Both require you to be either a US citizen or a lawful permanent resident to be eligible to apply, so it doesn't sound like you qualify.

2007-03-14 16:55:50 · answer #1 · answered by Linkin 7 · 0 0

There is no Master in Medicine in the US; the degree is the Doctor of Medicine (M.D. or MD, from the Latin Medicinae Doctor, meaning "teacher of medicine").

First you need to apply to medical school; most schools require a bachelor's degree in a related field and a passing score on the Medical College Admission Test (MCAT). Later you need to pass the United States Medical Licensing Examination (USMLE) in order to work as a doctor.

Visit or go online to the medical school you would like to attend and ask to be sent more details. You don't need to be a US citizen or a permanent resident to study here, but you do need to apply for a student visa.

2007-03-14 17:08:18 · answer #2 · answered by ? 7 · 0 0

I have never heard of an American school granting a master's in medicine. You need to apply to medical school and earn an M.D.

2007-03-14 16:42:45 · answer #3 · answered by Lisa A 7 · 0 0
