It was not founded as a Christian country. It was founded on Christian ideals, but everyone was allowed to worship as they see fit.
2007-04-09 05:54:54
·
answer #1
·
answered by Kevin C 4
·
4⤊
2⤋
This question can be confusing because people mean different things by "Christian nation". I assume you are asking whether we think the system of GOVERNMENT is itself "Christian". To that, I would say no, and I think most Americans would agree.
But there are OTHER senses of "Christian nation" that might apply... if you mean something looser, like a society that is shaped by Christian values, or even a GOVERNMENT whose values are strongly rooted in or shaped by Christian beliefs (if only because these beliefs informed the thinking of those who wrote the laws).
Now there are those who have suggested that all or nearly all the founding fathers were Christians and explicitly based the government on Christian beliefs. That is not so.
But NEITHER is the claim some have made here that the founding fathers were all or mostly deists, closet atheists, etc. There certainly WERE some deists amongst them, but that does not mean that THEY attempted to shape the government by THEIR religious beliefs. And many more were Christians (esp. if you count ALL the early members of Congress, etc.). Again, that doesn't mean they were trying to establish some sort of theocracy! (Don't know why people seem to think it must be one or the other.)
It would take a good look at both the statements and ACTIONS of these people to know in what sense, if any, they conceived of the government as being shaped by or at least to be esp. friendly to Christian beliefs. (Example of SOME relevant practices -- use of chaplains, opening sessions with prayer.)
2007-04-09 15:51:51
·
answer #2
·
answered by bruhaha 7
·
0⤊
0⤋
Pennsylvania and the New England colonies were absolutely founded as religious settlements. There were, therefore, strong Puritan ethics and religious echoes placed into our nation's founding by the representatives from those states. The Crown Colonies, however, like Virginia and the Carolinas, were founded as economic ventures. Therefore, the representatives from those states brought a more secular point of view with them to the Continental Congress. And, although our founding documents often mention God, they do not mention any specific denomination. I think individual liberty (for white landowning males) was more important to our founding fathers than any particular religion. I don't mean that comment about white landowning males as a put-down, either. If they hadn't established the notion of individual liberty for themselves, it could never have spread to all other segments of our culture. Sure, it's taken over two hundred years, and the results still aren't perfect (and are under attack from the current administration), but considering the previous five thousand years of human history, I'd say our founding papas did a pretty good thing in getting the ball rolling.
2007-04-09 05:50:11
·
answer #3
·
answered by Rico Toasterman JPA 7
·
2⤊
1⤋
I say NO.
First of all, most of the Founding Fathers WERE NOT CHRISTIANS.
Most were deists or Masons who believed only in a Superior Being, not the Christian God and Jesus.
Also, the First Amendment: Congress may not establish a state religion.
Also, the US signed a treaty after the war with the Barbary pirates (the Treaty of Tripoli), and Article 11 says, and I quote, that the government of the United States "is not, in any sense, founded on the Christian religion." And the Constitution states that we uphold all treaties that we sign.
The Constitution does not have god in it.
Watch the YouTube movie "Is this really a Christian nation?"
2007-04-09 05:53:40
·
answer #4
·
answered by MG 4
·
1⤊
1⤋
Our country was not really founded back when the Pilgrims arrived; we were merely English colonies then. We were founded after the Revolution, to be free of the British Empire.
So no, Americans don't believe our country was founded on the Christian religion, but on freedom from England; we were not a country before the Revolution.
People only wanted freedom from England; it wasn't really freedom in everything. Other freedoms were granted after discussion, and slavery was still going on. Christian idealism is what our country became centered on.
2007-04-09 05:48:53
·
answer #5
·
answered by kevin p 3
·
0⤊
1⤋
I'd give that a Yes and a No, personally. While it's true to say that the first Europeans to arrive in the New World were Christians (although I have an issue with this, because in 17th-century Europe it was distinctly dangerous to be anything BUT Christian, so there's a chance that a percentage of the new arrivals had just been playing along, if you see what I mean)...while that's true, it must also be pointed out that they'd made that arduous journey in order to get away from the ruling Christian orthodoxy of the day, who'd made their lives completely unbearable.
Given that, I have reason to be suspicious of the claim that they'd take kindly to their descendants berating and squelching other faiths even to this day.
2007-04-09 05:46:52
·
answer #6
·
answered by dorothea_swann 4
·
2⤊
2⤋
Now as then, they were religious extremists.
Now, before all you Americans get up in arms, look at our history. They left England because they were religiously persecuted in their home countries, and left to find a place to practice their religion their way. So at the time, they were considered religious extremists.
2007-04-12 08:58:31
·
answer #7
·
answered by Mark M 4
·
0⤊
0⤋
Most of the founders were Deists, not Theists. This means that they assumed that a Creator must be necessary, but they did not believe that this Creator had revealed himself in any allegedly holy book or inspired writings. You get to know him through nature, not through words.
The founders of the American Republic were very far from being anything like the fundies of our time. As Deists, they were with Voltaire, Rousseau and the like. In terms of today's language, they were pretty much religious Humanists, going along with allegations of the existence of a creator God but making the well-being of humans the measure of what is the best thing to do in making choices as they designed the new Republic.
2007-04-09 06:01:49
·
answer #8
·
answered by fra59e 4
·
2⤊
1⤋
No. It was founded on Enlightenment principles. The founding fathers we remember most for their revolutionary ideals were a mix of deists and closet atheists who believed that the government establishing a religion would lead to more harm than good. They had plenty of examples of that from Europe.
2007-04-09 06:03:33
·
answer #9
·
answered by K 5
·
2⤊
1⤋
Yes, most Americans will tell you that we were founded as a "Christian" country. (I am American and a Christian, by the way)
They base this on the idea that
1.) the "Pilgrims" were European Puritans who fled here so that they could practice their particular brand of Christianity without interference of the sort they experienced in England especially.
2.) The Declaration of Independence refers to a Creator.
I happen to bristle at the idea that the US was founded as a "Christian" nation. Many of the founding fathers were "deists," who, while they certainly believed in an intelligent higher power, did not necessarily accept that Jesus was his Son.
The US Constitution's first amendment specifically bans the establishment of any specific religion as a state religion, and guarantees the free practice of any religion within the US.
Pennsylvania, for example, was founded with the specific intent of being a haven to those religions that were feeling oppressed within their own lands.
So in short, the US was not founded as a Christian country. It was certainly founded with the belief in a "God" in mind, though what religion that specific God is God of is left ambiguous. However, most people within the US would identify themselves as Christians.
2007-04-09 05:52:49
·
answer #10
·
answered by Monc 6
·
1⤊
3⤋
It was founded as a country of religious freedom and tolerance.
2007-04-09 05:58:28
·
answer #11
·
answered by Jackie Oh! 7
·
1⤊
0⤋