Most white people aren't as evil as other races make us out to be. There are deranged and evil people in all races. So, why are white people always targeted as being evil and racist and the root of all the problems in the world? I don't fit into any of those categories, and I'm tired of people saying it. Take slavery, for instance. I wasn't around when it happened, and if I had been, I would've been an abolitionist anyway. Not all of us would've supported slavery or the bigotry of the "old South." It feels like people expect an apology from me for something I had no control over. I feel horrible that something like that could ever happen, but I can't change it. I'm not evil, and a lot of other white people aren't evil either. So, why do other races make us feel like the scum of the Earth?
asked by Sharlize De La Croix · 2006-07-18 16:05:14 · 9 answers