Get your bachelor's. Higher education means better opportunities. You're still young enough to balance studies and life, so take advantage of that while you can. Why limit your options?
2006-12-28 07:21:16 · answer #1 · answered by Anonymous
Go ahead and get your bachelor's. Although I personally believe you learn more in the "real" working world than you do in school, a lot of jobs require a minimum of a bachelor's, even for entry-level positions. If you ever find yourself needing a job outside of the nursing field, you'll be thankful you have that piece of paper, no matter how useless it may be in reality (and believe me, it's pretty damn useless). I work in a corporate setting where pretty much everybody is doing a job that has absolutely nothing to do with what they studied in school, but Human Resources still requires a bachelor's, even if it's a bachelor's in dead languages of the Middle East, or criminal justice, or worse, liberal arts.
2006-12-28 07:21:21 · answer #2 · answered by Lake 2
Get the bachelor's. I'm not even sure you can become an RN without one anyway.
-Chris
2006-12-28 08:22:51 · answer #3 · answered by christopher_kitchens562 2
My understanding is that a bachelor's degree in nursing is much preferable to a certificate from a nursing school. If you can, go for the degree.
2006-12-28 15:20:01 · answer #4 · answered by Ace Librarian 7
It is definitely worth getting a bachelor's degree in nursing, because it will let you further your career and earn more money.
2006-12-28 07:20:23 · answer #5 · answered by RainCloud 6
You should get your bachelor's, because if you want to move on further down the road, you'll have something to build upon.
2006-12-28 07:15:01 · answer #6 · answered by 2007 5
More education is always the better choice. At the very least, it will mature your mind and help you grow as an adult.
2006-12-28 07:19:07 · answer #7 · answered by Anonymous