I go to a supposedly great college. Granted, I take night courses, but I just don't feel as though I'm benefiting from it in the least. The only thing I know I'll get out of it is a very pricey degree, though I'd hoped a little knowledge would come along with it. I haven't learned one thing I didn't already know or care about. It's as though contemporary college (I've been to three universities) is a business trying to sell you the most prestigious piece of paper it can, as if education were the farthest thing from its mind. Or perhaps there's some ulterior motive involved: college exists so people can be pushed into jobs after they graduate in order to pay for it, making it a tool of the economy to keep us in line and working. Colleges are sheep factories that give graduates a false sense of accomplishment so they might feel better about settling for a career. Originally I believed it might help me define my goals and better educate me in many areas. wtf?
2007-10-30 18:50:11 · 4 answers · asked by Anonymous in Higher Education (University +)