
2006-08-20 10:10:48 · 25 answers · asked by abdizel 1 in Society & Culture Religion & Spirituality

25 answers

It's definitely not a Christian nation. If it were, people would be behaving like Christians, not throwing God out of schools and promoting unhealthy behaviors like promiscuity. It's a Christian country in name only, not in practice. Amazingly, the atheistic minority has taken a lot of control over policy by arguing that any law promoting Christian ideals mixes church and state. Ironically, atheism is a religion itself, called secular humanism; they even have a paper outlining their beliefs, the Humanist Manifesto.

2006-08-20 10:17:34 · answer #1 · answered by STEPHEN J 4 · 1 0

I don't believe the USA is a Christian country. I do, however, believe there are Christians living in this country.

2006-08-20 17:15:44 · answer #2 · answered by fragglerockqueen 5 · 0 0

The USA was founded as a Christian nation. That's why our people came here all those years ago.
This is not my interpretation; it can be researched by anyone who doesn't believe what I am saying.
We have also embraced with open arms those who practice other religions. Even though we were founded as a Christian nation, it isn't our desire to force others to believe what we believe. Again, that's why our people came here all those years ago: to flee religious oppression.

There is a quote by President George Washington that reads, "I believe it is impossible to properly govern a country without God and the Bible."

I myself believe this is true.

2006-08-20 17:25:04 · answer #3 · answered by kenny p 7 · 0 1

There are many different religions practiced in the USA. Most of those are Christian religions. But it is a secular country. We believe in the separation of church and state.

2006-08-20 17:15:43 · answer #4 · answered by Anonymous · 0 0

The USA has this thing called the separation of church and state, or at least it's supposed to; you couldn't tell by looking. However, the majority is Christian.

2006-08-20 17:15:12 · answer #5 · answered by Alex M 2 · 0 0

Two hundred twenty-plus years ago, our founding fathers devoted much of their lives to ensuring "Freedom of Religion" in this country, and that is the way it will always be. No one is forcing religion on anyone in this country. There are many beliefs in this country, not just Christianity. You have the right, here, to believe as you understand. Peace to you!

2006-08-20 17:22:23 · answer #6 · answered by Anonymous · 0 0

Hypocrites make up the main part of the USA and of any religion here, because they do not practice what they preach.

Now, I cannot speak for America, only the United States of America, because I have never lived anywhere else in America.

2006-08-20 17:20:57 · answer #7 · answered by Don K 5 · 0 1

NOOOOOOO!!!!!!!! It's a country where the majority of people happen to be Christian. That does NOT make it a Christian country.

2006-08-20 17:16:04 · answer #8 · answered by First Lady 7 · 1 1

75% of the population is Christian.

2006-08-20 17:15:56 · answer #9 · answered by Grundoon 7 · 0 0

96% say "I consider myself a Christian," so yes, it is by far a Christian nation.

2006-08-20 17:37:19 · answer #10 · answered by Anonymous · 0 0
