Is it just me, or does it seem like our country is growing into an increasingly sadistic, uncaring place? I mean, do you agree when I say that Americans are becoming more and more selfish and dependent on other people? *I'm not trying to be prejudiced or anything, I am American myself.* Do Americans take the simple things in life for granted, such as the ability to walk down the street unharmed or visit a grocery store with unlimited choice?
2006-09-19 09:37:19 · 11 answers · asked by Anonymous in Other - Arts & Humanities