They don't pay you what you're really worth in sociology, and on top of that they downplay social workers. They act as if social work is the only thing you can do with a sociology major, and they say the major never pays off. But the real question is this: why give sociology the title and the respect of a full major, rather than no more than a minor, if they know sociology majors will never make real money like lawyers, doctors, or businessmen? Everyone goes to college to earn more than the average person who skips it and goes straight from high school into the workforce. So why would they take advantage of a young person's vulnerability and impressionable nature, leading them down a path of deception while knowing they would never earn money equal to the work they do? Why would they ruin a person's entire life that way?
2006-08-20 07:38:26 · 9 answers · asked by sweetcream252006 in Sociology