What I don't understand is why the government is trying to take Christianity out of our schools. If America is founded on the beliefs of the Bible, "Under God", not Yahweh or Allah, why is the government trying to remove Christian values from the schools? I don't want to hear that Yahweh, Allah, and God are all the same God, because in the Torah and Koran the god seems to be evil and full of hatred. America is slowly going downhill; soon this whole country will be believing in false religions. It's a shame that Islam is rapidly gaining more and more members, surpassing Christianity. Can someone please share their views and opinions on this subject?

2007-06-19 03:58:33 · 51 answers · asked by Anonymous in Society & Culture Religion & Spirituality

I understand what everyone is saying: America is a free country, so why should one religion be taught and not another? The problem is that if America weren't a Christian or Godly nation, why would our currency say "In God We Trust", and why would Americans be the ones trying to elect a Christian official into office? It's because we are a nation based on God and Christian values. All I'm trying to say is that America is becoming a more secular nation, and anyone can look around and see the outcome; this country is not looking good. Also, I don't want to hear any more sarcastic remarks; all I want is your opinion, because I'm not debating with anyone.

2007-06-19 04:22:39 · update #1

OK, so I understand that the country wasn't founded on Christianity, but please don't tell me that "In God We Trust" wasn't added until the 1950s. America must have seen something wrong in society and decided to add it to the law. Just because it was added later does not mean it should be disregarded. By that logic, the whole "Marriage Amendment" or the legal sale of alcohol would have to be disregarded too.

2007-06-19 04:42:57 · update #2

51 answers

Because it promotes inequality and biased teaching for students who come from very mixed backgrounds and beliefs (in public schools)?

Islam is not the only other religious belief in the world. I really can't understand why Christians always feel the need to come down on that in particular. It's idiotic in my opinion to assume that all students should learn the same beliefs when people in general are different.

Oh, and another thing: schools are supposed to teach their students facts. Teaching something that not everyone believes, and that has no proof, would violate their most basic principles.

2007-06-19 04:01:51 · answer #1 · answered by Mystery Lady H 5 · 11 2

I'm a teacher, and I can tell you that it's not just Christians going to school. How would you feel if your religion were Islam or Judaism and everyone else was practicing Christianity in the classroom? You'd be excluded and you'd feel very uncomfortable. There is a really good reason why we shouldn't have religion in public schools. That is why there are private schools, so that people who wish to have religion in their education can do so. The private school in our area deducts tuition from your church donation (what you give as an offering counts as the tuition). My child has been in both private and public schools, and yes, we do miss the religious aspect of private schools, but the public schools tend to be better equipped (more teachers, etc.).

The reason we don't have religion in public schools is that public schools are paid for by taxpayers' money; therefore, people who are not necessarily Christian are paying for them. It also falls under the "separation of church and state", because it is tax money that pays for the schools. When America was founded, the founders didn't want people to be persecuted for their religion, and that even includes those who don't have a religion. America is based on the ideal that people will be free, which includes the freedom to choose your religion and the ability to express your ideas.

2007-06-19 04:06:08 · answer #2 · answered by April W 5 · 2 0

The Founding Fathers were very careful NOT to give any particular religion more prominence than any other--that was the purpose of the non-establishment clause, the first clause of the first amendment: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof..."

First we need to understand the meaning of the word "respecting" in this context. In modern language, it simply means "about," and the intent was to prevent Congress from establishing a State Religion because of the problems that practice had created in Europe during the preceding centuries. All but the strictest of constructionists agree that this prohibition also extends to State and local governments. Since public schools are part of local governments, those governments cannot use the schools to promote any particular religion or religion in general.

Many schools, though, have been erring on the side of caution and denying religious groups and students the right to free speech, which is also protected by that same amendment, as well as denying them equal access to government facilities, schools included. Check out the site of the Civil Rights Division of the Department of Justice to see its enforcement activities protecting religious rights:

http://www.usdoj.gov/crt/crt-home.html

2007-06-19 04:26:58 · answer #3 · answered by nightserf 5 · 0 0

Because Christianity was never supposed to be IN our schools. And if you are so ignorant of your own beliefs that you don't realise Allah, Yahweh, and God are all the same, since all three holy books share the same origins (read the Koran, the Torah, and the Old Testament), then you probably haven't read the First Amendment to the Constitution of the United States. By the way, "under God" was added to the Pledge of Allegiance in the 1950s, when McCarthy was looking for Communists under every rock and at every kitchen table. Apparently, to him, Godlessness was a part of communism.

Separation of Church and State was a primary tenet of this country's founding. If you don't like that, go form your own country where everyone must be a like-minded drone to your deity.

2007-06-19 04:05:21 · answer #4 · answered by mikalina 4 · 4 0

In this day and age, cultural diffusion is bringing more and more different people together. Back when our nation was first being formed, around 99% of America was Christian, aside from the Native Americans. Today, people are moving to America from all over the world, and Jews and Muslims believe they deserve an education that is not influenced by religions they view as false. There is also a very large number of Americans becoming atheists, further decreasing the power of Christianity in America. Schools are setting aside the Christian beliefs that have been part of American education for centuries because religious minorities are asserting their rights as Americans to live in a nation where church is separated from state, including education. Don't they deserve the same rights as Christian Americans?

2007-06-19 04:08:35 · answer #5 · answered by Anonymous · 2 0

Christianity is a religion, and this is supposed to be a non-religious nation. Teaching Christianity in public schools hinders religious freedom. If you want your children to have a Christian education, send them to a private school.

The nation was not founded under God. That phrase wasn't even used by the founding fathers; it was added to the currency much later. They were rationalists and deists, seeking to establish a country where people could believe and worship how they wanted.

The Declaration of Independence was influenced by Enlightenment thinking, not religion. There are whole sections borrowed from John Locke, not the Bible. This isn't a Christian nation and really never has been.

2007-06-19 04:03:20 · answer #6 · answered by Anonymous · 9 0

Christianity never was TAUGHT in our schools. As someone who started school in a small Illinois town in 1957, I can tell you that I was never taught anything about Christianity in school. Yes, there was a little bit of it implicit in music, for example, at Christmas. Yes, we did say the Pledge of Allegiance with "under God"; just who that God was or what he signified was never discussed, and no one could possibly have an informed opinion about him based on their public school experience. It is not, and never was, the public school's responsibility to teach people about religion. That's the responsibility of religious denominations. Our society has always been a secular society; from the beginning of the republic, people have never been forced to attend church. The founding fathers saw that as a good thing. So do I.
As for the god of the Torah and Koran being evil while the biblical God is not: well, read your Bible. There's plenty of evil and hatred in the name of religion, particularly in the Old Testament, and Christians of every stripe, including the KKK, are still using it to feed their prejudices.

2007-06-19 04:17:28 · answer #7 · answered by gehme 5 · 1 0

Public schools are funded by taxpayer dollars through state and local governments with federal assistance. They are a government institution. The Constitution of the United States, in the first amendment, states "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof."

It is therefore unconstitutional for the government to support one religion over another. This is what is meant by the separation of church and state. Thomas Jefferson further stated that "no man shall be compelled to frequent or support any religious worship, place, or ministry whatsoever, nor shall be enforced, restrained, molested, or burthened in his body or goods, nor shall otherwise suffer on account of his religious opinions or belief; but that all men shall be free to profess, and by argument to maintain, their opinion in matters of religion, and that the same shall in no wise diminish, enlarge, or affect their civil capacities."

To integrate religious teachings, regardless of which religion, into the public school system is to violate this separation of church and state. Remember that many early settlers of the American colonies left England to escape religious persecution because they held different beliefs from the Church of England, which was the government-mandated and government-backed religion. To insert Christianity into the schools would be to select and adopt a specific doctrine within the Christian faith (Protestant, Presbyterian, Catholic, etc.), and suddenly there would be a state-sponsored religion. This would tear down the very basic tenets of the First Amendment.

Removal or exclusion of Christianity from the schools is not the same as insertion of Judaism, Buddhism, or any other world religion. It is simply the education of children in those subjects that the schools teach, leaving religious teaching to the parents and clergy.

2007-06-19 04:13:05 · answer #8 · answered by Becka Gal 5 · 1 0

You're definitely ignorant of what the Constitution of the United States actually says. The United States is not founded on any religion. The word "God" never appears anywhere in the Constitution, and the word "religion" only appears in the First Amendment, where it says the government shall not establish a religion. The United States was intended to have a purely secular government, with the freedom to practice whatever religion you wanted, or not practice, if that was your choice. So stop listening to the distorted version of history being preached in your church; become a real American and read the Constitution.

2007-06-19 04:01:39 · answer #9 · answered by ? 6 · 9 0

The Phrase "Under God" was not on currency or even in the pledge of allegiance until the wave of McCarthyism hit the nation.

This country was not founded on the beliefs of the Bible. Sorry to burst your bubble, but the founding fathers were Deists. The nation was founded on the idea of equality for all men, freedom of speech and religion, and freedom from oppression.

2007-06-19 04:05:59 · answer #10 · answered by John C 6 · 2 0

It's pretty simple: in this country we have freedom of religion, and to teach only Christianity in public schools would be a violation of that right. And since we do not want to pay more taxes so the Feds can teach all religions in school, they have decided not to teach any religion in schools. Society already has a place where you can go and learn all about Christianity; it's called church. Schools are for learning math, science, history, and English. If you would like your kids to learn about Christianity while they are at school, there are many fine Christian schools across the country.

2007-06-19 04:08:49 · answer #11 · answered by Anonymous · 2 0
