I often hear that colleges and universities indoctrinate students with liberal ideology because of their liberal professors. I just graduated from college, and very few of my professors ever talked about their politics unless the class was about that subject. I did get the feeling that a lot of them were liberal, but I never felt like they were trying to change my point of view. I was just wondering what other people thought.
Asked by redguard572001 on 2007-03-19 12:05:11 · 3 answers · Education & Reference ➔ Higher Education (University +)