
4 answers

Yes, it is a fantastic idea. It gives you experience in the field and can look great on a resume. Besides, many employers are specifically interested in candidates with experience, unless you are going through the college's career services office or job fairs. Also, if you plan to work at a job while in college, the best thing you can do for your career is to work in your field in some capacity.

2006-11-09 16:52:29 · answer #1 · answered by dawncs 7 · 0 0

Yes, definitely. It helps a lot. The first major outcome is that you step out of your textbooks and see for yourself how an industry works. The textbooks and courses give you theoretical knowledge, which is necessary, but when you enter the "real world" you may find that much of what you studied is not used in your work. It can also help you choose your field of specialisation, add value to your resume, and make you more employable. So go ahead and grab the next opportunity!

2006-11-09 16:54:40 · answer #2 · answered by ar.karthikkn 1 · 0 0

Yes. It's also wise to consider doing an internship so that you can see what the career is like. If you don't like it, you still have a chance to change your major or career path, if that's what you want to do. Plus, paid internships can help you cover all of those college expenses.

2006-11-09 16:39:39 · answer #3 · answered by Persephone 6 · 0 0

Yes, because when you get out you can hit the job market with good experience under your belt.

2006-11-09 16:33:29 · answer #4 · answered by Anonymous · 0 0
