In my years of grade school and college, I noticed that the American educational system is very politically controlled. For instance:
- Black history emphasizes whites' oppression of blacks, but hardly mentions the many whites who fought and died for blacks.
- Women's history emphasizes the men who oppressed women, yet it hardly mentions the men who greatly helped women in their struggles. Think about it: if women couldn't vote, who gave them the right to? If women couldn't work, who gave them their first jobs?
- Science teaches the Theory of Evolution (which has never been proven, thus "theory"), but will not teach the theory of Creationism, at least not with the same time and detail. It could be taught from a vague, neutral point of view. Many believe in Creationism, including Muslims, Christians, Catholics, and many others.
Has anyone else noticed this? Aren't these all discriminatory?
I'm of Puerto Rican descent, and I've noticed this bias in our systems.
Thank you.
2006-06-16 07:42:03 · 11 answers · asked by man_id_unknown (4) in Gender Studies