
In the Declaration of Independence, it is written that "We hold these truths to be self-evident." Self-evident means that the morals of this country are not derived from any religious text, but rather from common sense. The god referred to in the Declaration is never singled out as the Judeo-Christian god, but seems to be a deist abstraction, since it is referred to as "Nature's God." Therefore, our morals are not Christian.

Article 11 of the Treaty of Tripoli states that "The Government of the United States of America is not, in any sense, founded on the Christian religion." This treaty was sent to the Senate in 1797, where it was read to every senator and unanimously ratified, 23-0.

The phrases "under God" in the Pledge of Allegiance and "In God We Trust" on our currency were added in the 1950s, at the height of McCarthyism, in order to set our country apart from the atheist Soviet Union.

So where does this idea of a Christian nation come from?

2007-08-13 04:01:19 · 19 answers · asked by Shinkirou Hasukage 6 in Politics & Government > Politics

The original pledge was written by a Baptist Minister named Francis Bellamy and went as follows:

"I pledge allegiance to my Flag and to the Republic for which it stands, one nation indivisible, with liberty and justice for all"

2007-08-13 04:01:38 · update #1

Thomas Jefferson was a self-described deist, and Benjamin Franklin once said that lighthouses were more useful than churches, so it looks as if those two weren't Christians...

2007-08-13 04:10:20 · update #2

Lurchleft, the fact that many of the founders were devout Christians doesn't mean anything, since they obviously did not want America to become a theocracy. If they had been trying to create a theocracy, they never would have included the Establishment Clause in the Constitution...

2007-08-13 04:13:18 · update #3

Wow, what makes anyone believe that I want to repress anyone? I don't want to stop anyone from practicing their faith, I just want people to realize that those who say that this is a Christian nation are mistaken...

2007-08-13 04:17:42 · update #4

19 answers

It wasn't... unfortunately, a lot of people like to revise history and tweak it to suit their needs.

A majority of the legislators were Christian, but none of them (that I can think of) wanted to saturate the new republic with Christian dogma, and they had the foresight to protect other religions and beliefs from the government.

You are right about the deist movement at that time; it was very popular with intellectuals (and our founding fathers were intellectuals). There was also a very strong element of Quakerism in colonial life, and that religion had a very liberal interpretation of the Christian scriptures and the concept of God.

Thanks for writing this question. I am always angered by how people twist our founding fathers into religious zealots. They were far from it... that doesn't mean they did not believe in something, though. But in general they did not think God guided everyday life. At the same time, they were Western men, and Western society had been Christian in nature for hundreds of years... you can't discount the influence of Judeo-Christian philosophies (even if it was subconscious) on the founding fathers.

2007-08-13 04:14:35 · answer #1 · answered by cattledog 7 · 2 5

Well, it's sort of a thin line, don't you think? Times were VERY different when this part of the world was settled. And to claim that this country was NOT founded on Christianity would mean you either knew the founders or have proof to the contrary. The fact that God is included in our laws is the very representation that America is mostly a deist country. But free will is what is most important. I would like to think that America has been blessed as much as it has because we are making progress toward HIS divine will. I do believe we need to make changes, environmentally and financially, but I believe our intentions (in general) are good. Christianity only means that you believe in a God. America certainly believes in something. God bless all our worlds. :o]

2007-08-13 11:34:01 · answer #2 · answered by Anonymous · 0 0

I am 100% certain the nation was founded by Christians. They also seem to have had foresight enough to ensure that they did not create a theocracy. They left the references to God generic so that other people of faith might accept them as their own. I'm not sure who you think is making the claim that we are a Christian nation. There are, of course, some nut jobs on every side of every issue, but by and large most people accept that the nation was set up using Judeo-Christian values as an original basis for law but was intentionally not established as a theocracy.
It is reasonable to call America a "nation of Christians" in that the majority of Americans (65%) identify themselves as Christians. The numbers were much higher in the past.
The Declaration paraphrases John Locke, so maybe you can check his faith.
And I'm not a Christian, but I can figure this one out. Is it that upsetting to you? Keep reading; any history book should get you really pissed off.

2007-08-13 11:37:01 · answer #3 · answered by joshbl74 5 · 1 0

It is true that the United States was founded on Christianity, simply because every colony was founded as a Christian colony, and most were founded for the religious freedom of various Christian sects, be they Catholics, Puritans, Baptists, or what not. Everything from our Constitution to our legal system is rooted in the Judeo-Christian tradition. However, it is equally important to realize that the sources you are citing were attempts by the founding fathers to protect the rights of religious minorities, which is why a specific God is not mentioned and why there is no official religion.

Also, just a note from a purely legal perspective: the Treaty of Tripoli deals with international recognition as opposed to national. It is a very small legal difference, but basically it governs the US in international terms rather than national ones, though those principles have since been incorporated into US law.

2007-08-13 11:23:47 · answer #4 · answered by Erica 3 · 2 1

The idea of a Christian nation comes from the fact that the vast majority of Americans have always been Christian. There has never been a non-Christian president. The Constitution and the Bill of Rights were written by men who were all Christians. What's the problem?
If you don't want to be a Christian, don't be. But you people who claim to be liberal yet have no tolerance for the religious beliefs of others are nothing more than pretenders. Intolerance is not liberal, not even if that intolerance is directed toward Christians.

2007-08-13 11:13:58 · answer #5 · answered by Crystal Blue Persuasion 5 · 5 1

Because most, if not all, of our founding fathers were men of faith. Not all of them were Christian, but all religions employ the same types of myths to teach common values that all people share regardless of religion or culture.
As human beings, we all value the same types of myths regardless of our particular beliefs:
the knight/hero myth
the martyr myth
the ascension myth, etc.
All religions have them, just different stories behind them.
So even if our founding fathers were not Christian, their views of what is right and what is wrong, in regard to preserving freedom for all, would have been pretty much the same.
America was founded on their values, which were derived from a majority of them being Christian, but that doesn't mean those values are exclusively Christian in nature. Nor does it mean they intended us to be arguing over whether our government should be used to force faith-based issues on everyone, as the extreme right-wingers of today fail to understand.
Even though many quotes can be found to claim they were Christians, just as many can be found to show they abhorred the idea of a religion-based government of the kind the extreme right of today wishes to promote.

2007-08-13 11:30:59 · answer #6 · answered by Boss H 7 · 0 0

It doesn't matter.

We can all argue back and forth all we want about this. But the fact is that the First Amendment clearly bars the government from establishing a religion or hindering its free exercise. How we were founded doesn't change that. Calling us a Christian nation doesn't change that. It has no legal meaning whatsoever. Let them call it a Christian nation founded on Christianity all they want. It doesn't change anything.

2007-08-13 11:35:45 · answer #7 · answered by Anonymous · 0 0

Could it possibly be the principles? Surely "GOD" does not refer to Buddha, Muhammad, or Zeus, does it? Besides, what difference does it make? The idea of a Christian nation may come from the fact that most people in the soon-to-be USA at that time were Christians. Besides, what are the alternatives; atheist? What a bunch of loons. The "I don't believe in GOD so no one else can, even though most people in America do" crackpots. They neglect the needs of the many for the wants of a few. Not very patriotic, I would say. Or is that the ACLU? Doesn't matter, all loons I say.

2007-08-13 11:41:33 · answer #8 · answered by BRYAN H 5 · 0 1

The founders of this fine nation were Christians... they believed, as other Christians do, that it is your choice to follow whichever religion you think best, even if they believed theirs was best... but there is no denying that this country was based on Christian principles... it can't be helped that the founders were Christian. You'll just have to live with it.

2007-08-13 11:12:20 · answer #9 · answered by Erinyes 6 · 4 2

I don't think the question is whether the government was founded on Christianity so much as whether the Founding Fathers were Christians. Even "under God" and "In God we trust" don't specify a particular God.

2007-08-13 11:07:35 · answer #10 · answered by Brian 7 · 12 2
