I'm a college grad, born and raised in the U.S. This topic was brought up by a professor I had.
He said American women have been raised to lie by omission (it was originally a Southern female trait, now recognized as an American one). I feel like this has expanded from just lying by omission to general dishonesty. The weird thing is it also seems like American women would occasionally prefer to be lied to as well.
Before I'm bombarded by angry women, let me explain. Lying by omission is basically failing to inform someone, and it's not just women who do it. It is, however, a trait that's much more prevalent in American women than anywhere else in the world (including American men).
"Little white lies" Lies to spare someones feelings, or just lieing to keep what you have are unacceptable to most people of the world....why is it that it's acceptable for women to desive in America?
2007-08-06 12:41:38 · 29 answers · asked by Super P