
6 answers

No, it doesn't, but it does increase your chances of a well-paying job. Not going to college just about guarantees a low-paying job.

2007-09-24 15:19:51 · answer #1 · answered by Max 7 · 0 0

Not right away. Employers also want you to have experience in that field. Now... there are some jobs that pay a lot more if you have a college degree. The higher the demand, the higher the pay. I have read a lot of articles... and your 30s are when you start to make more money (because you are no longer doing the entry-level jobs).

2007-09-24 22:24:11 · answer #2 · answered by hot47qt 4 · 0 0

NO!!!!! Just because you have a degree doesn't mean that industry will still be in demand by the time you graduate. The only things that SEEM to be sure investments are certain parts of the healthcare industry, since people will always get sick, and law, since people become more and more lawless as the days go by.

2007-09-24 22:18:36 · answer #3 · answered by peaceablefruit206 7 · 0 0

You need to earn some sort of degree in order to get a job, and it also depends on what field of study the degree is in. No, it doesn't GUARANTEE it... but it helps a great deal nowadays.

2007-09-25 00:18:44 · answer #4 · answered by Mimi 4 · 0 0

Nope

2007-09-24 22:21:27 · answer #5 · answered by Anonymous · 0 0

No. In fact, you'll probably work for someone who has less education and makes a lot more money.

2007-09-24 22:32:12 · answer #6 · answered by Becky J 4 · 0 0
