
At least in our values, is America still, fundamentally, a Christian country? What about in practice?

2007-01-17 14:25:30 · 25 answers · asked by YourMom 4 in Society & Culture Religion & Spirituality

25 answers

America never was a Christian country, and we should hope that it never will be.

2007-01-17 14:30:44 · answer #1 · answered by Anonymous · 3 3

In our values, America isn't any more Christian than Pagan. Egyptians had codified values quite early on, and what worked was simply adopted by cultures as they mingled.

If you think that the United States is based on Christian values, try spending some time with history. Read the Treaty of Tripoli, signed by President John Adams in 1797. Here is Article 11:

ARTICLE 11.
As the government of the United States of America is not in any sense founded on the Christian Religion,-as it has in itself no character of enmity against the laws, religion or tranquility of Musselmen,-and as the said States never have entered into any war or act of hostility against any Mehomitan nation, it is declared by the parties that no pretext arising from religious opinions shall ever produce an interruption of the harmony existing between the two countries.

In practice, we're trying to remain moral, and the only role religion plays is the one that individuals choose to give it. As a Pagan, my morals are of no less worth than those of my Christian neighbors. In some individual areas we may disagree as to what is right or wrong, but in general, we are in agreement.

Atheists, Pagans, Christians, Muslims ... all have morals, and in general, these are what we use to maintain order and law in a country. Being non-Christian does not imply anarchy.

2007-01-17 22:38:22 · answer #2 · answered by Deirdre H 7 · 1 0

The Christian and non-Christian founders of the United States founded a secular country, one in which every individual can choose to believe or not as he or she wishes. Some will tell you that the Pilgrims founded America as a Christian nation, but what they founded was Plymouth, an English colony. The United States was founded in 1776, with our Declaration of Independence from English rule.

2007-01-17 22:37:49 · answer #3 · answered by Dawn G 6 · 0 0

No. The United States of America is a secular constitutional republic, not a theocracy.

80% of Americans claim to be Christian, but this does not make the NATION Christian.

2007-01-17 23:43:20 · answer #4 · answered by Kathy P-W 5 · 1 0

If some Christians had their way, we would be like the Middle East, with no separation of church and state. We are a secular country where everyone has the right to believe or not believe anything they want without persecution. Religion takes over when good people do nothing.

2007-01-17 22:32:41 · answer #5 · answered by Vlasko 3 · 1 0

No, absolutely not. The majority of America is Christian, but we are in no way a Christian country. What makes me proud of the USA is that it's made up of all different kinds of religions, races, beliefs, etc.
No one religion should control a country. And there is no such thing as Christian values. They're just human values, and people put a label on them. Like if I went "I don't like to kill puppies... killing puppies is bad. OK, let's call that an Amberistic value." Makes no sense.

2007-01-17 22:33:46 · answer #6 · answered by ....... 4 · 1 1

To a considerable extent, yes. Our beliefs in freedom of religion, justice for everybody, and equality come from the Christian faith as understood by the colonists.
Even now, when many people don't go to church, their values of generosity and fairness come from Christianity.

2007-01-17 22:37:15 · answer #7 · answered by The First Dragon 7 · 1 0

We are a secular country with a Christian majority in charge. In practice, sometimes yes and other times no. It really depends on your definition of Christianity; Christian values differ among its many sects.

2007-01-17 22:31:59 · answer #8 · answered by Anonymous · 0 2

Are we a Christian country?

not anymore.

We are a nation that wants to do what it wants and refuses any correction or standard. In essence, we refuse to be accountable.

We as a people only care about appearing to be moral, and most don't even care about that anymore.

There must be a void, a deep stench of depravity and debauchery, before people begin to realize how bad things are. Then the heart of this country will be ready for revival, true revival!

2007-01-17 22:37:19 · answer #9 · answered by lewbiv 3 · 1 1

Except for the select few wackos who hang out on Yahoo Answers, we are most certainly a Christian country. Let another 9/11 or devastating tragedy happen and see how many people drop to their knees and pray to Almighty God.

2007-01-17 22:34:45 · answer #10 · answered by Anonymous · 0 2

No. Never have been, and (hopefully) never will be. The Constitution is very, very clear on the point. The day this becomes a Christian nation is the day 90% of Christians will be ostracized as heterodox.

2007-01-17 22:30:33 · answer #11 · answered by NONAME 7 · 2 1
