It really seems odd to me that white women will allow Hollywood to portray them as trashy whores who can't keep their panties on. I was dragged in to see The Devil Wears Prada, and it had the same old storyline: white women have no morals while other ethnicities are pure and wholesome. This isn't a racial question, it's a morals question. Do white women not realize what image Hollywood is assigning to them? And women still pay money to be portrayed as sluts. I have a real problem with the way Hollywood portrays white women. For a long time, blacks were portrayed as thugs and gangsters with no morals, until it became unfashionable to continue that storyline; now it's whites who are being attacked. I don't understand why they must always run people down to make a movie. Am I the only one who notices this cycle?
2006-09-02 10:16:03 · 9 answers · asked by Anonymous in Other - Society & Culture