I obviously don't mean all Americans, but the feeling of antagonism toward anything non-American is on the rise in America today. This is not the America I knew for years before I moved away in 2000.
Only since last year have I heard talk of soccer-haters, 'un-American,' and 'we don't care what the world says' in America. This is not to say Americans shouldn't be proud to say they are Americans, but many are showing this pride by stressing how different they are from, and better than, non-Americans. That might seem harmless, but not to me, if you accept America's special place in the world as a leader, a land of hope and love.
Do you have a feeling that some people are fanning the flames of nationalism (maybe 'nationalcentrism,' if the word exists) in America? Nationalcentrism is a clear sign a country has lost a bit, if not more, of what used to make it great.
I don't know, but I knew a different America.
2006-07-06 10:10:39 · 12 answers · asked by Fontonfrom