Remember movies like Menace II Society and Boyz n the Hood? I'm sure you remember movies like that: movies that always portrayed black men and women in a negative light. They were always shooting one another, or the women were sleeping around. But now movies like that have faded away. We now have movies and shows that pair our successful black men with white women, or our beautiful black women with white men. Do you think the media plays a big role in the way black people treat one another, even in relationships? For instance, in the movie Monster's Ball: black men, would you have liked to see Halle Berry have sex with someone other than a white man? And black women, whenever you turn on the television and see a white woman kissing a black man, does that raise a hair on your head? Why do you feel there are not as many positive interactions between black men and women? Why do you think they continually show movies and shows like this? And why do we constantly watch them?
2006-08-24 20:10:27 · 7 answers · asked by Tonya 3 in Family & Relationships ➔ Singles & Dating