Yes, I think so, but you don't necessarily NEED one to get good wages in the future; you just have to work a little harder, and you can end up making money without a degree.
2006-08-24 10:45:17 · answer #1 · answered by bluemoon 3 · 0⤊ 0⤋
Okay, common sense and society say to get something beyond a high school diploma, meaning a degree. USUALLY, the better-paying companies would rather hire someone WITH a college degree than someone who only went to high school, so I'd say it's smarter. However, you're not guaranteed better wages or a better life. People who believe that are stupid. College GUARANTEES you *nothing*. It is merely another step you are encouraged to take to increase your chances of getting a better job. It all depends on your personality and what you really want to do in life.
If you're having trouble deciding what to major in, just do something you have more of an inclination toward. Or you can always go in "undeclared." It really doesn't matter, though, because all that matters is that you went to a college with a good name and received that little piece of paper. Everything else means jackcrap. So you might as well pick the easiest major and just have fun. It's really just a fallback option for you, unless you're going to Harvard or something for law school so you can become a lawyer (bleh).
2006-08-24 17:52:11 · answer #2 · answered by Anonymous · 0⤊ 0⤋
It may or may not guarantee better wages, depending on the career. However, it will likely open more doors than it closes.
Many people assume that just going to school is enough to get that job, but it isn't. Experience is also key, and it can be gained by volunteering while going to school. If you want to make the most of your education, volunteer in your field and/or take up an internship while in school or during breaks. It makes a difference.
That said, I constantly tell young people to pursue degrees unless they can't for some reason. I do this mostly because my husband didn't. He went to trade school and became a journeyman laborer, which pays fairly well. However, after a while, and before the age of 30, he began to experience severe rheumatoid arthritis. He had to leave his job and go back to school.
Thank goodness he did, because his body had deteriorated so much that before he left school, he required 4 surgeries to put it back together. So, if you plan to work in a field that requires your physical strength, consider at least getting some education (maybe in business management) so you aren't betting on your body staying healthy throughout your working life.
2006-08-24 17:48:49 · answer #3 · answered by BeamMeUpMom 3 · 1⤊ 0⤋
I just finished my master's, and I saw a pay schedule for teachers. The more college credits and advanced degrees you have, the more money you will make. My salary has increased over the years as I've worked hard and made the sacrifice to go to school... so YES, I agree that a degree guarantees a better wage.
2006-08-24 17:51:24 · answer #4 · answered by Sam M 3 · 0⤊ 0⤋
There are no guarantees in this world, and that also applies to your degree. I have an MBA and a BS degree, and I work alongside a person with no college education. It all depends on the field you are in and how your education relates to it.
You can have all the degrees in the world, but if you are working in a field that does not value them, your wages will be low.
I would still recommend that you get your degree, but do it in a field you are very interested in, so it applies to your work and helps you get better wages.
2006-08-24 17:50:25 · answer #5 · answered by Colorado Answers 2 · 0⤊ 0⤋
It increases the probability. A lot depends on what profession you want to pursue. Some companies won't give you a second look if you don't have a degree; others will look more at what practical experience you have.
To many people, a degree shows two things: that you are willing to work at something, and that you have a well-rounded background (educationally speaking).
2006-08-24 17:53:42 · answer #6 · answered by Ro-bot 5 · 0⤊ 0⤋
Well, no... it doesn't GUARANTEE a good salary, but it will take you places that a high school diploma can't. Hey, just give it a chance and try it out. You may like it. Although college isn't for everybody, a lot of people benefit from their college degree. Maybe you should go to a community college for a semester and see if you like it. Figure out what you're interested in and take the classes on that subject. As long as it's interesting to you... you will love it!
2006-08-24 17:49:33 · answer #7 · answered by janeywb 4 · 0⤊ 0⤋
It will not guarantee it, but the chances of getting a good job do increase. My brother-in-law is a human resources manager for a large company in California. They no longer want an entry-level floor manager who does not have a 4-year degree. The only way you can work for the company without one is as an assembly person. They make $13 per hour in California.
2006-08-24 17:51:07 · answer #8 · answered by eimmahs 5 · 0⤊ 0⤋
Choose your career path wisely. See a career counselor. They can tell you which careers are lucrative and which ones are more of a spiritual calling (meaning people go into them for the love of the job and not the money).
Any degree would give you better wages than if you had none, and more career options.
2006-08-24 17:55:16 · answer #9 · answered by kathy r 3 · 0⤊ 0⤋
It is not a guarantee. I have a degree, and the field I went to school for is impossible to get a job in unless you have experience outside of school, and you can't get experience because you can't get a job... a catch-22. Basically, if you can teach yourself what you want to do, how you learned it doesn't matter to the employer (this is what I learned after $30k in student loans).
2006-08-24 17:47:03 · answer #10 · answered by NNY 6 · 0⤊ 0⤋