I hear the news and read the articles saying that people who earn a degree make x% more than their peers who don't go to college. I understand how they come up with this, but when I look at my situation and the people around me, I just don't see it.
I went to school for three years, dropped out, and am now working for a national association, making more than enough money to support myself and my family. My employer knows I don't have a degree, and while he encourages me to get one, he says that as long as I keep doing what I'm doing and using my "street smarts" to improve the business, he'll keep me around.
I have friends who graduated, some who went on to get their master's degrees, and most of them are still bartending, working for a family business, or working as managers in retail stores with no benefits... all while paying off $40-60k in student loans. So my question is: is a degree as important as people say it is?
2007-10-18 05:51:09 · 12 answers · asked by pooljccaa1 in Higher Education (University +)