I mean a practicing Christian nation. I say overall no. What do you say? And why?

2007-03-02 08:56:32 · 12 answers · asked by Anonymous in Society & Culture > Religion & Spirituality

12 answers

I still think it is, although it is definitely moving away from that, slowly. I also think the world is becoming more concerned with material possessions, celebrities, image, self, etc. It's certainly not Leave It to Beaver anymore, when the focus was on the family.

2007-03-02 09:06:45 · answer #1 · answered by straightup 5 · 0 0

It's a Christian nation simply because 80% of the population is Christian. But if you mean that Christianity is the only religion in America, then no, it's not.

2007-03-02 17:10:08 · answer #2 · answered by Anonymous · 0 0

You can't base an entire nation on a personal choice. For America to be a Christian nation, EVERYONE would have to accept Christ, and they'd have to mean it. Otherwise it's not really Christian, is it?

So, no, America is not a Christian nation.

2007-03-02 17:04:23 · answer #3 · answered by ac28 5 · 0 0

It's a Christian nation in that the majority of people are Christian. But Christianity isn't the official state religion, so no.

2007-03-02 17:02:21 · answer #4 · answered by Anonymous · 1 0

George Barna of Barna Research did a survey in the community, not just in the church. He asked people if they were Christians, and 90% said yes.

After asking them several questions foundational to Christianity, such as "Is Jesus God?", "Did he die on the cross?", and "Are you a sinner saved by grace?", he concluded from those interviews that only 6-7% are truly Christians.

This is a study from one of my husband's college courses that he did just a year ago. If these statistics are accurate, then Christians are in the minority, not the majority.

2007-03-02 17:06:59 · answer #5 · answered by cinderella9202003 4 · 1 0

There is no such thing. You cannot call America a Christian nation now, nor has it ever been one. The majority of people don't believe in God at all, and that has always been the case.

2007-03-02 17:01:31 · answer #6 · answered by Anonymous · 0 0

No. There are churches everywhere, but I think it's starting to lose that edge. People are starting to adopt many different religions. For example, a few months ago, which isn't that long ago if you think about it, there was an article on Yahoo that talked about the rise of Buddhism in America. I think people are starting to look at different choices.

2007-03-02 17:05:53 · answer #7 · answered by honeyluvsyou2004 2 · 0 0

No, because America is all about accepting one's religion and not judging based on that. We do have a large number of Christians here, but that doesn't mean it's a Christian nation.

2007-03-02 17:01:35 · answer #8 · answered by ♥katie♥ 3 · 0 1

No. I don't.

However....

Since you don't allow email, I suppose this format will have to do.

You wrote:

"But we American Christians are persecuted in subtle ways - slander; not given equal positive voice in mainstream media; not allowed to pray in school or share our faith.
We don't want to control people. We just want the right to live out our faith which includes sharing it with others."

Specifically, how is it that you think Christian children are not allowed to pray in school? Or that the media does not reflect liberal Christian values? And please, living out your faith does not require converts. Leave us alone, and you will be criticized less.

2007-03-02 17:11:41 · answer #9 · answered by manic.fruit 4 · 0 0

The majority, yes, unfortunately.

But things are getting better. I guess that's why Christians are getting so angry.

2007-03-02 17:02:53 · answer #10 · answered by kent_shakespear 7 · 0 0
