I feel there has been a huge negative change in our society. 10 years ago, young women were considered strong, bright, opinionated, and respectable. Today it seems that a majority of young women care only about appearances and celebrities, act way too ditzy, and dress provocatively. It also seemed as if people were finally starting to really accept "black" culture. Now the term "******" is used way too easily, in a derogatory sense, and no one wants to correct people when they say it. Not to mention the new season of "Survivor," where the show is dividing up teams by race. Didn't our ancestors fight for something?
2006-09-11 15:16:43 · 3 answers · asked by mtneerchic01