Hello! I'm considering going back to school to pursue an associate's degree in nursing to become an RN (or at least start out as an LPN/LVN). I'm interested in nursing because I want to 1) help people and 2) have a stable career. But I've done some searching online and have found quite a few current nurses who hate their careers. Many of these ladies (and men) cite cattiness and backstabbing at work, verbally abusive doctors and patients, and a lack of support. As a male, I realize I would be going into a female-dominated field, so I would expect some friction there. But this abusive behavior seems to be spread across the board! Such a negative work environment would make this career unbearable. What are your experiences with this? I know it is a high-stress field, but that does not excuse the supposed lack of professional respect. Here is the site where I found these opinions, for reference:
www.aboutmyjob.com
What are your honest opinions about today's nursing field as a job?
asked by atomicfrog81 · 2006-10-10 18:35:49 · 5 answers