Lately, I've been noticing a disturbing pattern taking flight, and I'm curious whether anyone else has noticed it too. No offense, of course, but it seems that white America has taken to adopting black males as the latest accessory, the new handbag, the latest Prada, perhaps. While I don't have an abundance of years behind me, I can remember that not long ago the "idea" of interracial coupling was treated as an "issue" on after-school specials. Now, suddenly, black males are the friendly new Ken?? Not to paint this as problematic; personally, I'm glad society has taken such a leap, but on what basis? Is it simply because black males are "interesting" and "exotic," or white women "fun" and "different," or because boundaries are actually being transcended and progress actually being made? And in that case, what role do black women play in this new trend? (Please be mindful not to lump African American females and biracial females into the same racial category, as there is a difference.) What is your perspective?
2007-01-28 08:57 · 13 answers · asked by evelynn waugh