I really don't understand this! If whites are simply 'American,' then why are blacks called 'African-Americans' and Indigenous peoples 'Native Americans'? It's like the government, and other stupid people in our society, in an attempt to sound like our 'compassionate friends,' have inadvertently uncovered their inherent racial prejudices. If 'Native' Americans are indigenous to the land, then shouldn't we simply call them Americans? If we are to refer to black people as 'African' Americans, then shouldn't whites be referred to as 'European' Americans? I think it's racist to call blacks 'African.' In reality, the entire genealogy of our species stems from Africa. Shouldn't we, as civilized people, recognize the law of the land, which states that anyone born on U.S. soil is simply 'American'? I know the U.S. is about assimilation, but to end racial/cultural static, shouldn't we empower EVERY citizen with a strong 'American' culture? (Not MTV/pop trash culture)...
2007-02-26 10:30:51 · 16 answers · asked by Anonymous