
How important is the college you went to... after you have graduated and are looking for a job in your field of study? WHY?

2007-01-26 07:03:43 · 2 answers · asked by k............... 1 in Education & Reference Higher Education (University +)

2 answers

There are some colleges that are known for different things. Magazines like Time and U.S. News & World Report rate them on their business schools, law schools, engineering, etc. This only matters for getting top dollar for a job right out of college. If you graduated from Harvard Law, you are going to be making more money than if you graduated from the University of Idaho.

But if you have a degree in something, you will be employable somewhere; you might just have to look harder and accept less money if it's a small, unheard-of school.

2007-01-26 07:16:03 · answer #1 · answered by Sweet n Sour 7 · 0 0

Your experience matters the most. The college you went to is most helpful when you're looking for your first job. You'll have very little experience at that point, so your grades and the rep of your school factor in more.

Later on, when you do have more experience, employers care more about that experience than anything else. Your school's rep only comes in if they need a tiebreaker or something.

2007-01-27 01:33:12 · answer #2 · answered by Linkin 7 · 0 0
