"Name Brand" schools are a benefit where the prestige of your college will make a difference. For instance, Investment Banking is a field where the school you went to makes a big difference. On the other hand, many jobs like teaching, accounting, and nursing want a specialized certification and it will make little difference where you went to school. Of course, there are exceptions. For higher end private schools, the better the degree, the higher the chance you will get hired to teach there.
Interestingly, researchers are finding that name-brand degrees don't lead to the CEO's office at Fortune 500 companies -- most of the men (yes, men) holding those positions come from the best public university in their state.
Another study I saw reported that students who got into HYP (Harvard, Yale, Princeton) but went to a good state university instead had the same average income as their private-college peers, but far fewer loans to pay off.
The Ivy League schools will help you in life if you pursue a prestigious career. They give you an alumni network. They brand you as a "smart, informed, well-rounded person." But you can make your own breaks if you are not lucky enough to get in, or if you decide on a less expensive route.
Good luck.
2006-12-07 06:27:09 · answer #1 · answered by sfox1_72
All fields.
I believe that you can get as good an education at a good state school as you can at Harvard or Yale. However, it is also possible to skate through the good state schools without learning much.
The top universities in the country are very selective -- and it is much harder to get out without getting a decent education. Therefore, a lot of companies only recruit students from the top schools.
You may not get a better education -- but you will probably get a better job.
2006-12-07 16:35:01 · answer #2 · answered by Ranto
I think it's less the field that matters, and more what type of job you will have after school. I am working on my PhD, and for academic jobs (professor) where you get your degree is (almost) everything, no matter which field you are in. I suspect that many other types of jobs are the same.
However, the importance of your college will diminish over the years as you work, and your experience will come to matter more.
2006-12-07 14:27:21 · answer #3 · answered by Anonymous
Law, Business, Medicine
2006-12-07 14:21:48 · answer #4 · answered by Blue
Postgraduate degrees. Undergraduate degrees aren't their focus.
2006-12-07 14:26:32 · answer #5 · answered by Father Knows Best