I attend a small liberal arts college in South Carolina. I'm an Education major, and I've noticed that most of the professors go to church and talk about religion in the classroom constantly. They assume that all education majors attend church, and they make comments about how they like to see the Ten Commandments hung up in a school. I think this is ridiculous. I was taking a course on the child, family, and community, and all the professor and the other students talked about was religion (Christianity) and how they can judge a school the moment they first walk into it. Some even suggested that Title I schools were "bad" schools and anything above that was a "good" school. I feel very strongly about this, but I was afraid to speak up for fear of being ostracized. What do I do? They are warping the minds of future teachers.
asked by tigerlily23 · 2007-01-20 02:37:41 · 4 answers · Social Science ➔ Other - Social Science