Do women think that men are just actors who try to win their hearts and have all the fun they want, but deep down don't really care about them or want to stand by them through times of adversity?
If that is the case, they should not blame men entirely for it. In my opinion, shouldn't they be blaming Western culture itself? It is Western culture that places emphasis on the ego, i.e. do whatever you want or can to make yourself happy.
What are your thoughts on this?
2007-02-12 16:17:49 · 13 answers · asked by Anonymous in Philosophy