Not really a question, but an observation: I think reality shows are so anti-women and anti-feminist in their cruel and often superficial depiction of women. Many would argue that men are exploited also, but men are not the ones who have had to fight for the right to vote, hold down a job, be taken seriously, or run for Congress... I would really love to see reality TV just disappear. I am sick of seeing women being degraded and dehumanized on a weekly basis. Men can be dehumanized and exploited till the end of time for all I care, but stop degrading the far superior female species!!! Feel free to comment on this.
2006-10-17 04:11:31 · 32 answers · asked by Jaime P in Television