Not anymore. We are a "post-Christian" nation. Our national spiritual origin is Christianity, and most of the Founding Fathers and colonists believed and lived according to the Christian faith. The educational and political systems were created on the assumption that parents' beliefs and values would be passed down to their children.
However, during the '50s and '60s a huge moral and spiritual breakdown was brought on by many factors, including a loss of faith in the government over Vietnam War decisions, a loss of trust in parental wisdom and authority, and the culture of rebellion and drugs. These "me" generation graduates threw off the value systems of their WWII- and Depression-era parents and were open to anything that felt good and didn't make them feel guilty.
Although there are bastions of America's Christian past, they are for the most part overshadowed by judges who care more about personal rights than personal responsibility, parents who care more about their reputation than their children's behavior and attitude toward authority, and people who care more about their own needs and wants than how their decisions affect others.
Whether good or bad, the advent of diversity has changed our nation.
2006-10-07 06:23:16 · answer #1 · answered by Mmerobin 6 · 1⤊ 1⤋
No... the USA is NOT a Christian nation. Our Founding Fathers went to great lengths to assure that would be the case. The 'law of the land' was not based on any Christian or biblical doctrine or writings... it was rooted in the secular humanist ideals of the 'Age of Reason', and based on 'The Code of Hammurabi', English Common Law, and the constitution of the Iroquois Confederacy. The 'Treaty of Tripoli' (June 7, 1797) specifically states, in Article 11: "As the Government of the United States of America is not, in any sense, founded on the Christian religion; as it has in itself no character of enmity against the laws, religion, or tranquillity, of Mussulmen (Moslems); and, as the said States never entered into any war, or act of hostility against any Mahometan nation (Islam), it is declared by the parties, that no pretext arising from religious opinions, shall ever produce an interruption of the harmony existing between the two countries." This treaty, unanimously approved by the Senate and signed into law by John Adams only a few years after the ratification of the Constitution, is taken by constitutional scholars to be a clear and unambiguous declaration of the intent of the founders.
2006-10-07 06:17:45 · answer #2 · answered by Anonymous · 0⤊ 0⤋
We are not quite a theocracy, despite the wishes of a few insane people. We do have a large portion of Christians, many of whom agree with the notion that religion should be separate from the state. At the same time, religious institutions do have a lot of pull in the US.
I would say that we are strongly influenced by Christian dogma.
2006-10-07 06:15:22 · answer #3 · answered by Anonymous · 1⤊ 0⤋
This nation was founded by deists, NOT by Christians.
If they had been Christians, they would have included the name of Jesus in their official documents.
In the same way, many people in this nation THINK they are Christians, but they are not. Most of you who THINK you are Christians don't even know what being a Christian really means.
2006-10-07 06:55:30 · answer #4 · answered by Born Again Christian 5 · 0⤊ 1⤋
America seems to have a larger percentage of professing Christians than other nations, but that doesn't necessarily mean the nation is Christian. We will all be judged individually.
2006-10-07 06:18:54 · answer #5 · answered by John 4 · 2⤊ 0⤋
Not fundamentally so. I think a lot of people believe in the Abrahamic God. They might go to church once in a while, and they believe, but it's just something they were raised with, as opposed to religion being their entire life. I do think there are larger numbers of atheists, Pagans, and minority religions than we know. Plus, I think the Founding Fathers came from various backgrounds, Christian and otherwise. So, no, we're not a Christian nation.
2006-10-07 06:16:06 · answer #6 · answered by swordarkeereon 6 · 0⤊ 1⤋
81% identify themselves as Christian. Turning the other cheek has purely to do with seed planting. God would not want truth forced on people. If an unbeliever asks a seed planter to stop a teaching and the seed planter keeps going and gets smacked, he is to turn the cheek (he didn't do it Christ's way and deserves the smack). If a bully on the streets hits a Christian, we whack them with everything we've got (see John 2:14-16, where Christ beat the hell out of some money changers set up in the church).
2016-11-26 23:03:19 · answer #7 · answered by ? 4 · 0⤊ 0⤋
This part of Yahoo! Answers is available in several English-speaking countries, including the US, parts of Canada, Great Britain, India, and Australia. I will answer for my own nation, the US.
No. We're a diverse nation.
2006-10-07 06:13:33 · answer #8 · answered by anyone 5 · 0⤊ 0⤋
I'm talking about America, FYI.
I think we are a religious nation, but not necessarily a Christian nation. The Constitution and Bill of Rights were founded on morals, and your morals are derived from your religion.
2006-10-07 09:28:26 · answer #9 · answered by Ginny Weasley 2 · 0⤊ 0⤋
If you are referring to the United States, no, I don't think that we are a Christian nation. Most of the Founding Fathers were Christian, and today there are a lot of Christians in America, but I think we've drifted from our Christian roots. If this were not the case, pornography, murder, abortion, alcoholism, and drug abuse wouldn't be as common in America as they are.
2006-10-07 06:20:42 · answer #10 · answered by David S 5 · 2⤊ 1⤋