At this time of year, with the country saturated with Christmas imagery, it can seem that they are right. Are they? Is America a "Christian nation"? Should it be?

2006-12-21 10:10:58 · 24 answers · asked by PATY.P 2 in Society & Culture Religion & Spirituality

24 answers

Nah, it's not a Christian nation.
It's a commercial nation :)
It's easier to just be happy that you have time off work and that you're getting gifts, etc., than to stand up and say "omg this is stupid."
Back in the day it was popular to be religious. Church was a big town gathering and a power base to hobnob with everyone, and if you weren't seen there, then there was something wrong with you.
It's just fashionable to be in the largest group.
There are way too many points of view for this to be a "Christian" nation, and most Christians aren't even following Christian ideals anyway.
Having an entire religious fundamentalist group as a nation is stupid.

2006-12-21 10:14:48 · answer #1 · answered by Prof. Timpo 3 · 0 0

That depends on what criteria are required for it to be a "Christian nation." I feel that it is NOT a Christian nation, because it was not founded on Christianity (or any other religion, for that matter). The Treaty of Tripoli makes that clear.

http://www.nobeliefs.com/Tripoli.htm

However, many believe that America IS a Christian nation because people who consider themselves Christians are largely in the majority, at a whopping 75%.

http://www.religioustolerance.org/christ.htm

I feel that America should NOT be a Christian nation, or an atheist nation, or even a Pagan nation. America is not governed by any particular religious faith, and it should remain that way. When religion and government are one, sh*t happens. We can learn that from history. I enjoy my freedom of religion, and I think everyone else here does, too. The only trouble is that some people would rather OTHER people not have it. If that is the case, then there are plenty of countries already established with one specific governing religion. Those people should consider relocating to one of them.

2006-12-21 19:02:37 · answer #2 · answered by Lady of the Pink 5 · 0 1

No, it's not - and no, it shouldn't be. This has always been a secular nation, and must remain so for our basic freedoms to stay intact.

For more info, see the First Amendment and the Treaty of Tripoli ('as this is not, IN ANY WAY, a Christian nation ...').

Edit: This country was founded in the 1700s. "God" on the money and in the pledge didn't arrive until the 1950s. The founding fathers were mostly atheists and deists, not Christians. This country is NOT based on "Christian principles," unless Christianity is somehow the only religion or group that EVER came up with the idea that we shouldn't kill or steal.

2006-12-21 18:20:19 · answer #3 · answered by eri 7 · 0 1

Jesus asked us to love God with all our heart, and love our neighbor as ourself; to forgive those who do harm against us, and love our enemies.

I meet a great many fellow Christians who strive to do these things and more. Still, America as a whole does not care for the poor anywhere close to the level Jesus required of us.

And as a nation, we seem so often to be concerned with power and accomplishment, control and aggression, rather than compassion and kindness.

Hopefully Christians can begin to demand more from our government, so that The United States can become a nation that cares about creating peace in the world, and justice for the oppressed, more than anything else. In other words, maybe we are not a Christian nation, but perhaps we should strive to be a "Christ-like nation."
peace

2006-12-21 18:20:59 · answer #4 · answered by Colin 5 · 1 0

It is only a Christian nation if 100% of the citizens are Christians. Only 85% are... so America is not a Christian nation. End of discussion.

2006-12-21 18:14:13 · answer #5 · answered by Anonymous · 0 0

No, our forefathers were Deists, believing in the moral teachings of the Bible, but not the miracles, such as creation and the virgin birth.

Christmas has only been an official holiday since 1870, but it didn't really grow in strength until our government was trying to make us afraid of communism. The thinking of the time was that anyone not observing it must be a communist.

Frankly, since God says that he intends to destroy all the nations of the world, I rather doubt any nation can be considered Christian.

2006-12-21 18:25:04 · answer #6 · answered by Anonymous · 1 1

The majority of the country is either Christian or considers itself to be Christian, and our founding fathers considered us to be as well (check out how they referenced God in the Declaration of Independence). I'd say that, percentage-wise, we are a Christian nation, even if some people don't always act like it.

2006-12-21 18:18:51 · answer #7 · answered by Cylon Betty 4 · 1 0

A Christian nation? Yes.

A nation founded on Christianity? No.

A nation whose government is based on Christianity? No.

2006-12-21 18:13:22 · answer #8 · answered by . 7 · 2 0

It shouldn't be; the country was founded on freedom of religion. People came here to escape religious persecution, and many of the founding fathers were Freemasons.

It's a free-religion nation that contains Christians.

If you put butter in a sour cream container, it won't make the butter sour cream. What I mean is, just because there are Christians in the US does NOT make the US a Christian country.

2006-12-21 18:15:05 · answer #9 · answered by Anonymous · 0 0

I think it is becoming more and more diverse. With the next generations, a lot is going to change. Many of the American political leaders are very religious or at least have some religious influence that affects their work. I think that in the next few decades, a lot will change as more Americans become involved with their government.

2006-12-21 18:14:20 · answer #10 · answered by Anonymous · 1 0
