It is a Christian nation because it is 80% Christian and
Christians hold all the power.
It shouldn't be because there is Freedom of Religion.
2007-02-21 07:52:46
·
answer #1
·
answered by ? 3
·
3⤊
0⤋
The Establishment of a religion by the Government is banned in the First Amendment.
The United States is a Republic. The only Republic prior to the US was the Roman Republic in Pagan Rome.
The United States has a representational Democracy. Democracy is a pagan Greek political system, and is not compatible with Christianity.
There are no aspects of Christianity incorporated into the Constitution or any other foundational aspect of the United States.
Practice your religion as you want, but attempting to alter the foundational principles of the United States shows that you really do not understand what America is about, and maybe you should go find some nice Middle Eastern country to live in.
Most of you religionists want to get closer to your God? Then go find him where Moses found him, in the middle of the desert on the northern tip of Africa.
Don't let the door hit you in the bum on the way out.
2007-02-21 16:02:20
·
answer #2
·
answered by Anonymous
·
2⤊
0⤋
Perhaps there is some clarification needed in your question. I would say without reservation that this country was founded on Judeo-Christian values; however, whether this country is a Christian country today is a far more difficult question to answer.
I respect your right to say what is on your mind, and I expect the same from you. That, in and of itself, is a Christian value and is more or less part of the fabric of our culture. At its roots, Christianity is a good thing. I am not here to spread the gospel or even convert you into a Christian. I am merely here to say that we are, and always will be, a Christian nation.
Put very simply: if the American people have been brought up in this Christian society, and the American people are the fabric of this great nation, then is it not logical that Christianity is interwoven into American culture?
Your point about removing Christianity from America is disheartening to me. Is it so bad to live the way Christians do? Even if you do not believe in Jesus, or any God for that matter, is it so bad to treat others as you would be treated, love others, forgive others, and be of service?
Lastly, I will say that if you were to poll the American public, I think you would find that Christianity is deeply and irreplaceably embedded in our great republic.
Best of luck in your search for answers.
:)
2007-02-21 16:02:14
·
answer #3
·
answered by TheGarlicButterSaw 3
·
0⤊
1⤋
America is a secular nation and therefore can't be described by any religion. If you are talking about the country that people call America that I'm living in right now, however, it seems to be steadily evolving into a theocracy of some sort. Slowly, Ten Commandments and Mayflower Compact monuments are cropping up on courthouse lawns in Christian fundamentalist communities, and the people who try to oppose them are being ostracized by their neighbors. I think these are just the first small steps toward destroying our religious freedom, and later our political freedom.
History has shown us that when governments are given the power to oppress the people, they will eventually use it, and religion is a highly effective psychological tool for controlling a populace.
The only way to protect our religious and political freedom is to join together with others who believe that America was meant to be secular, and therefore open to all religions and philosophies, and make our voices heard in peaceful and creative protest.
2007-02-21 16:15:20
·
answer #4
·
answered by l m 3
·
2⤊
0⤋
Well, unfortunately, America is a Christian nation, as the overall numbers would suggest, and of course according to Bush Sr. and Bush Jr. Anyway, this nation shouldn't be based on one particular religion. This nation is supposed to be for everybody of all different beliefs. This nation was founded for the purposes of liberty and justice for all. This nation shouldn't have some of the biases that we do have. The best thing that can be done to make sure this nation is run well would be to keep religion and politics separate from one another. You know, it's funny to me, because I always thought this nation was meant to be secular.
2007-02-21 15:58:24
·
answer #5
·
answered by Anonymous
·
1⤊
0⤋
America is a Christian nation because of its history, despite the fact that many of our own founding fathers had no religion and left England so they would not be victims of religious persecution.
America should not be a Christian nation because, despite the fact that we boast "freedom of religion," we don't take it seriously, and we bash those who do not believe in an American God/Jesus, or in any at all.
I don't think there is anything we can do to keep the nation from being a Christian nation. It already is, and whenever someone sounds the alarm that something is wrong, Christians and churches are the first to jump all over it.
2007-02-21 15:54:55
·
answer #6
·
answered by Blanca 3
·
1⤊
1⤋
No it is not a Christian nation.
The USA was founded upon the enlightened European ideals of Thomas Paine, John Locke, Voltaire, etc.: that all men had rights, and that no ruler had power except that which was bestowed by the people, which meant that governmental power was subordinate to the power of the people (not a Christian value or ideal).
The Founding Fathers even DECLARED in public, to the world, that America was NOT a Christian nation in the Treaty of Tripoli, which outright stated that the USA was in no way founded as a Christian nation. Here it is:
"Article 11. As the Government of the United States of America is not, in any sense, founded on the Christian religion; as it has in itself no character of enmity against the laws, religion, or tranquillity, of Mussulmen; and, as the said States never entered into any war, or act of hostility against any Mahometan nation, it is declared by the parties, that no pretext arising from religious opinions, shall ever produce an interruption of the harmony existing between the two countries"
The Treaty of Tripoli was authored by American diplomat Joel Barlow in 1796. The treaty was sent to the floor of the Senate on June 7, 1797, where it was read aloud in its entirety and unanimously approved. John Adams, having seen the treaty, signed it and proudly proclaimed it to the Nation.
It should in NO WAY EVER become a "Christian Nation," and to prevent this we might try to be a smarter, more decent species that cares about the rule of law, instead of howling at our officials to break, cancel, or make laws to suit our own bigoted ends.
2007-02-21 15:56:26
·
answer #7
·
answered by Anonymous
·
2⤊
0⤋
No, I don't think America is a Christian nation (it is a post-Christian nation).
Yes, I believe it should be one.
1. America is no longer a Christian nation because we emphasize grass and recycling instead of what matters, like the family. And we teach evolution instead of the Bible, etc.
America still has Christian traits but we are quickly getting rid of them.
2. I don't think Christians have much choice outside of the US, where they can live their lives as they believe they should. All other countries are either too liberal and secular, or too Islamic, for our liking.
2007-02-22 00:50:07
·
answer #8
·
answered by Samuel J 3
·
0⤊
1⤋
It is a nation where you can freely worship in whatever religion you choose; however, Christians are the majority by far and have a big impact on this nation. We shouldn't become a Christian nation, though, because that would make us the same as the Islamic countries, it would lower our education system, and it would turn us all into sheep.
2007-02-21 15:56:33
·
answer #9
·
answered by Anonymous
·
1⤊
0⤋
Most of America's founding fathers were men of faith, but they also clearly believed that one's faith is a very personal matter, something to be decided by each individual according to his or her conscience. They went out of their way to construct a secular constitution and a First Amendment that would ensure religious freedom for all.
No, I don't believe the U.S. is a Christian nation. Currently a majority of its citizens are Christians, but that's not the same thing. A majority of our citizens are Caucasian and female, but that does not make this a "white woman's country." The United States was based upon Enlightenment views of the proper roles of government and respect for personal freedom.
How to keep it from becoming a Christian nation, or Muslim nation, or Scientologist nation? Respect the First Amendment and maintain the separation of church and state.
2007-02-21 15:56:06
·
answer #10
·
answered by Anonymous
·
1⤊
0⤋
What happens to the Jews in a Christian nation? (I'm just limiting the question to Jews because that's what I am) Do we live here by special dispensation by Christians? If this is a Christian nation, then do Christians get to decide who is allowed to live here and practice their religion? Do you want a theocracy like they have in the Middle Eastern countries?
Bumper sticker: The last time we mixed politics and religion, people got burned at the stake.
2007-02-21 16:07:21
·
answer #11
·
answered by Anonymous
·
2⤊
0⤋