It seems to me that at some point in our nation's history, we shifted from the "Land of Opportunity" to the "Land of Entitlement." All too often I hear folks complain that the government "owes" them, or that the government should be "helping." This line of thought irks me.
I am going to throw this last part in because I am concerned that this question might turn into some kind of racial or gender issue. Race and gender have nothing to do with my thoughts above. There are women and men, white folks, African American folks, Latino folks, and on and on, who have been proactive, treated America as a land of opportunity, and made a great life for themselves. And there are people in all of those groups who have done the opposite: thrown their lives away and yelled at the government for not helping.
(And please, I am aware that Hurricane Katrina was a debacle, which doesn't make this question an easy one to ask.)
What are your thoughts?
2006-06-30 01:28:05 · 11 answers · asked by Bruce B in Government