Maybe some of the Founding Fathers followed Christianity in some form or another, but the main architects of the Constitution and the Bill of Rights were definitely not Christians.
Thomas Jefferson was a Deist. When he wrote about God he meant the Deist conception of God as nature, not Jehovah or Jesus. He even wrote his own version of the gospels that removed all references to theism and left only the moral teachings of Jesus.
Benjamin Franklin was a Deist. His friend Thomas Paine was an ardent opponent of the Bible. George Washington was considered a Deist by his own Christian pastor. (He was a member of the Christian church in Virginia because membership was MANDATORY in colonial days.)
The first six presidents of this country were Deists or Unitarians. John Adams and Congress enacted the Treaty of Tripoli, which said in plain English that America was not a Christian nation.
So why do so many people repeat the fallacious claim that "this country was founded Christian"?
2006-07-18 05:48:49 · 24 answers · asked by Eldritch 5 in Society & Culture ➔ Religion & Spirituality
Some points I'll try to address:
The Pilgrims didn't establish the Constitution; even the colonial charters were not the driving force behind it. People like Franklin and Jefferson, children of the Enlightenment, essentially created our country. They made America a different type of nation from any other on earth at the time. They didn't do this by following the beliefs of the Puritans.
2006-07-18 06:04:46 · update #1
Oh, and just to clarify...
By America I do mean the country formed by the 13 English colonies. I'm not taking into account the Native American era of dominance on the continent, or the other European territories.
2006-07-18 06:08:34 · update #2
Most of them don't even read up on the history of their own religion, and you want them to read the history of the nation they live in? Sheesh...
Seriously, that has a lot to do with it. And if they ignore history, then they can replace it with whatever they want... such as America being founded as a Christian nation. Surprisingly, though, many who say this (and will argue the point) will also claim they think all Freemasons follow the devil... Now, point out how many of the presidents have been Masons... hehe
2006-07-18 09:29:03 · answer #1 · answered by Kithy 6 · 1⤊ 0⤋
Probably because the same ignorant people don't believe in global warming, evolution of the species, abortion to rid the world of unwanted babies, and all the other myriad things the drooling religious right denies.
You have your facts a bit skewed, though. America was discovered by the Spanish, who colonized the Southwest in the name of Geezus, long before that miserable band of religious misfits got kicked out of England, showed up on Plymouth Rock, and started making beer.
The founding fathers realized the power and ignorance of the religious right and how official state religions had nearly destroyed Europe, so they took great pains when they drafted the Constitution to make it impossible for the drooling religious right to usurp the constitutional government and install Geezus as the head of the country.
They did a pretty good job of it until Reagan got elected; he not only showed big business how to break the unions, but also showed the drooling religious right how it could take over the country and shove its ignorant, narrow-minded beliefs down the throats of the majority...
2006-07-18 06:02:28 · answer #2 · answered by Anonymous · 0⤊ 0⤋
I don't think it was, because if it was, why did so many of the founding fathers support slavery and hatred? That is not Christian at all, and it makes me mad because they didn't care about the slaves. They took land from the Native Americans and pushed them all over the United States, and finally onto reservations in the Midwest with the worst land possible. That was not a Christian nation. However, there was a small group of people who were Christian, like Harriet Beecher Stowe, who wrote Uncle Tom's Cabin. I admire her because she was a Christian in a non-Christian nation.
2006-07-18 05:57:27 · answer #3 · answered by Jolisa 2 · 0⤊ 0⤋
You know, maybe no one really looked at what religion or faith they were following; maybe the people from then and now just see the fact that there are many references to "God," which is important.
I mean, I have noticed things like credit issues. After 7 years your debts are pardoned; that is in the Bible, where it says to forgive all your debtors and forget about it if in seven years they have not paid you.
Another one is the phrase being a "good Samaritan," which everyone knows comes from one of Jesus' parables. So maybe that is what people refer to when they say this country was founded Christian. I'm sure there are many other Biblical ideas built into the government that I don't know about.
2006-07-18 05:56:05 · answer #4 · answered by Marillita 3 · 0⤊ 0⤋
That's a fair point. There are some who might argue that Deism is not necessarily as incompatible with Christianity as you make it out to be. There's no doubt, though, that men like Paine and Jefferson would not be considered mainstream Christians today. However, it's obvious that Jefferson believed in Judeo-Christian principles; otherwise he wouldn't have bothered with creating a Bible he thought reflected Christ's true teachings.
2006-07-18 05:55:11 · answer #5 · answered by michinoku2001 7 · 0⤊ 0⤋
Because the Pilgrims came to America with only faith and a Bible.
Benjamin Franklin did not "found" America, and neither did any of the early presidents. When it states "America was founded as a Christian nation," who founded it? You're picking at this one too hard when the answer is obviously in front of you. Read about the Pilgrims and when they came over here on the Mayflower and the Santa Maria; you'll understand then. They are the ones that claimed America and established it as their own. Then, a few generations later, the colonies were forming into towns and states and the presidents started forming a government.
2006-07-18 05:56:30 · answer #6 · answered by Anonymous · 0⤊ 1⤋
Good question. If you really look back to the first Americans, the Indians, they were not Christian or of any other conventional religion.
However, the majority of the first foreign settlers here were of the Puritan faith. I guess when people speak of the founding fathers and how they spoke of God and His son, they put it together as being Christian.
2006-07-18 05:54:29 · answer #7 · answered by CARL Z 2 · 0⤊ 0⤋
Because they have to promote their stone age belief system and the easiest way to elevate it to a more important position is by rewriting history. Since people in America are typically ignorant regarding early American history, it's an easy lie to get away with.
2006-07-18 05:57:02 · answer #8 · answered by Anonymous · 0⤊ 0⤋
Because the first settlers were Protestants trying to escape the king's rules that would make them part of the Church of England. But I don't care about their religious beliefs; the point was they needed freedom to pursue their beliefs. That's why they came to the "New World," and that's why people still come here. I am an atheist, and if any religious people tried to break down my door and drag me out to "turn or burn," I'd shoot them where they stand, because this is America, and no one is allowed to attack you because of your beliefs.
2006-07-18 05:58:27 · answer #9 · answered by Anonymous · 0⤊ 0⤋
If people don't even bother to read their own book of God, the Bible, do you expect them to read anything on history?
The whole idea when the US was established was to break from the Church and its dogmas. People know very little about their own history.
Therefore God stepped down when the US was created.
2006-07-18 05:52:25 · answer #10 · answered by PicassoInActions 3 · 0⤊ 0⤋