Why do people think that the more money you make, the happier you will be? I don't understand this... people make money so they can be happy, right? But if you're already happy, then making a lot of money isn't really that important, is it? My mom always tells me that "things will all come together" once I get a high-paying job, but I'm not so sure. I don't think dollar bills can cure a broken heart or a hopeless feeling... can they? I mean, sure, you'll be able to buy more things, but it's the core of the person that counts, right?
2007-04-13 10:46:12 · 18 answers · asked by beachgirl in Careers & Employment