It does not seem to me that there is any sort of honest American nationalism anymore. Everyone, it seems to me, thinks of themselves as part of some sub-faction within the nation, but not as a member of the nation as a whole.
Examples:
Hispanic immigrants do not think of themselves as Americans; 99% of them will say they are Mexican, not American, even if their family has been here for generations.
Texans will refer to themselves as Texans, even if you meet one outside of the US! New Yorkers will tell you they are from NY, not the USA. If you travel, ask around and see what people say!
I personally live in California, and I feel that I have nothing more in common with someone from Louisiana than I do with someone from Australia. Isn't this how everyone feels?
I think most people in the US are focused inward and identify more with their own race, region, personal interests, or other sub-grouping than with being an "American".
Thoughts?
2007-01-17 08:10:13 · 6 answers · asked by Anonymous