I'm a Black teacher, and I've been very, very nice to the other teachers--mostly white women. I've volunteered to help them on projects and always greet them with a smile. But I recently found out that many of those white women were going behind my back, reporting every mistake I made, and even lying to the Principal about me--which landed me in trouble (all the while, these same women made 100 times more mistakes than me).
It's shocking because these same women smiled in my face and pretended to be my friends--but were stabbing me in the back!! They give off this nice image to the world--but they act like devils behind closed doors...
So, I just wonder: Is it a part of white culture to be very fake on the job while stabbing people in the back? I think this is a common problem in many workplaces.
Serious answers only. Thanks!!
I have white people in my family whom I love--so please don't think I don't like white people.
2007-03-15 11:22:54 · 5 answers · asked by Anonymous in Business & Finance ➔ Careers & Employment