He's putting himself above others.
2006-10-23 08:57:12 · answer #1 · answered by drummer boy 2 · 0⤊ 1⤋
Each particular segment of the college population has its own biases. As an English major, I'm more inclined to think that majors such as English, history, psychology, etc. are the best because they are more academically based and require more subjective and critical thinking. As a group, we tend to look down on business majors because it seems most of them are majoring in business simply because they don't know what to do with their lives. As liberal arts people, we know we won't ever be rich, but that's okay, because we do what we love. I would much rather be a poor English major than a rich business major any day. A business major can lead to great things, but mostly, I think, it leads to a daily grind in a cubicle or a purgatorial middle-management position in some bloodless company.
2006-10-23 13:39:22 · answer #2 · answered by Evelyn's Mommy 5 · 3⤊ 0⤋
A business degree can lead to a lot of other things, like an MBA or a law degree. So yes, it actually is a real degree.
However, when I was in school, most of the Business Majors I knew were shallow, superficial tools who could barely spell Business. So I see where the author was coming from.
2006-10-23 08:54:05 · answer #3 · answered by Who_Dey_Baby? 3 · 4⤊ 0⤋
We considered business to be a real major. As science majors, we thought the liberal arts and the arts were a joke, but not business.
2006-10-23 12:51:19 · answer #4 · answered by Lea 7 · 0⤊ 2⤋
The person who said that probably majored in philosophy or art or some other bullsh*t and therefore does not have a "real" job.
2006-10-23 08:58:28 · answer #5 · answered by Anthony S 2 · 0⤊ 3⤋
What does "real" mean, anyway? I wouldn't get worked up over it.
2006-10-23 08:55:42 · answer #6 · answered by retorik75 5 · 0⤊ 1⤋
He's an idiot!
2006-10-23 09:45:32 · answer #7 · answered by sunshine 4 · 0⤊ 1⤋