
Studies show that in the US, most people are illiterate about what most religions teach. To know anything about American history, for example, Judaism and Christianity are essential subject matter. Could it be that some of our problems with Islamic nations are because we don't understand the basics of that religion? Shouldn't the basics of these and other religions be taught?

2007-11-05 13:03:48 · 16 answers · asked by thundercatt9 7 in Society & Culture Religion & Spirituality

16 answers

I think the basics of the major religions around the world would be a good class for all students to take. It would help us to understand and maybe even appreciate someone else's point of view, respect their beliefs, "walk a mile in their shoes."

I can, however, see many parents objecting to their children being taught anything about other religions. They would probably fear that some kind of conversion attempt is taking place.

It would be a good class for knowledge, information and understanding.

2007-11-05 13:12:54 · answer #1 · answered by whimwinkle 3 · 3 1

Yes, as long as you teach it in a completely unbiased way: teaching the facts, not trying to persuade anyone. Of course that's difficult, which is why most schools stay far away. But religion is a pretty basic part of the world, so it's not unreasonable to offer classes about the basics of the major religions. (That is, not the Flying Spaghetti Monster, because no cultures or countries were ever shaped by the Flying Spaghetti Monster.) It would be too much to have it as a required class, but offering it to anyone who's interested in, say, the basic philosophy of Islam, is great. Time magazine wrote an excellent article on this subject a few months ago.

2007-11-05 18:09:37 · answer #2 · answered by sarah v 2 · 1 0

It's a good question, but a tough one. I mean, where do you draw the line on which religions you teach? Judaism, Christianity and Islam are all major world religions, but within each are important branches like Catholicism, Lutheranism, etc.

I think that religion can be referenced in American history, and in that sense it is necessary. But I don't know that it's absolutely necessary to have a keen understanding of the Pilgrims' religion, for example, so much as of the reason they came to the "New World": to gain religious freedom.

2007-11-05 13:15:51 · answer #3 · answered by Miss Brown 4 · 2 1

Nah, better that we keep all religion the hell away from schools. If we teach the basics of some religions whilst rejecting others, this can lead to a biased outlook in the students. Whereas if we teach just one religion, they become brainwashed zombies like most Muslims and Christians. And if we taught the basics of all religions, that would eat into the time needed to teach more important subjects like math, science, literature, history, geography, etc.

2007-11-05 13:14:05 · answer #4 · answered by Anonymous · 1 1

Absolutely NOT! Which religion do you wish to impose on our children? Every family should have the right to choose which spiritual path they wish to teach their children, and not have to worry about some fanatic in the public education system telling their children that they're wrong, or making them uncomfortable because the school chooses to teach one particular brand of religion over another. If you want your child to learn religion in school, that's what private schools are for.

2007-11-11 17:02:20 · answer #5 · answered by c'est moi 1 · 0 0

I think this could make for a good class; however, it would probably be too biased and teach more Christianity than any other religion. If they could pull it off, it could teach more tolerance of other cultures and other religions.

And if they bring prayer back to school, I am burning down every school in my city! Church and State are supposed to be separate, yet Christianity always seems to find its way into our government, swaying decisions.

2007-11-05 13:59:41 · answer #6 · answered by nin_tao 2 · 1 0

No. Religion has no place at all in public schools; that's what church-run private schools are for. How would you determine which religions to teach, given the plethora of sects within each?

Tolerance of other cultures should be taught in schools, yes. But that is unrelated to religion.

2007-11-05 13:17:25 · answer #7 · answered by Anonymous · 0 1

Comparative religion classes are fine. Telling the factual story of the Mayflower Pilgrims and Roger Williams is fine. When I was in grade school, the cruelties of the Puritans were overlooked, brushed under the rug. It was only in high school that we learned, via writers of that time, just how sick the Puritan colony was and why it led to a secular government here.

2007-11-05 13:10:59 · answer #8 · answered by Anonymous · 2 0

NO!!!!!!!!!!!!!!!!! The Constitution calls for a separation of church and state. No specific religion or set of religious beliefs should be taught in school. Just as people are dogmatic that sex education should be handled by parents, so religion should be taught by parents as well.

2007-11-06 03:47:26 · answer #9 · answered by darkdiva 6 · 0 0

First of all, Christianity is NOT a religion. It has been thrown in with all the others and called religion. Religion was around long before Jesus Christ came on the scene, due to man's natural inclination to worship God; however, a whole lot of other stuff was thrown into the mix - trees, cars, people, calves, moon gods, star gods, animals and so on - and God was left out completely. A shame! This world is in the trouble it is in because, again, as it was in the beginning, man thinks he's got it going on and doesn't need God. FOOL!!! One of the biggest mistakes America ever made was taking God out of the equation. I still recall the day they took prayer out of school. Shame, shame, shame!

2007-11-05 13:27:17 · answer #10 · answered by Titus12 3 · 1 3
