
Someone wrote in a newspaper today that this is still supposed to be a Christian country.

What does this mean? And furthermore, how can it be a Christian country if many non-Christians live here?

I agree that it may be predominantly Christian, but does that make it a Christian country?

2006-10-26 10:18:44 · 29 answers · asked by aurora03uk 1 in Society & Culture Religion & Spirituality

29 answers

This is not a Christian country.

2006-10-26 10:21:20 · answer #1 · answered by trouthunter 4 · 1 2

Back in ye olden days the king or queen ruled the roost and decided the religion of the country, and ye head was hacked off if ye did not obey. Now we are in 2006 and live in a democracy where we're supposed to be ultra-PC, embracing differences and cultures etc. That's in theory; unfortunately, in practice there will always be those stuck fast in ye olden times who just don't get it! It's incredibly frustrating. Why can't we all wear what we want, practise what religion we want, believe what we want, eat what we want? Ahhhh! It's all madness, isn't it? What is British? I mean, hello, there are black Christians in other countries, and there are countries where the predominant religion is not Christianity. What is wrong with the world, I ask? Sadly the answer is that we are humans, and as humans we are flawed. This debate will go on forever. I guess all we can do is keep speaking out and hope that more and more people understand. Thanks for listening!

2006-10-26 17:29:19 · answer #2 · answered by Gemmafriend 2 · 0 1

I believe the thought behind this being a Christian nation is the idea that it was founded by Christians, and that the Declaration of Independence, as well as our entire system of freedom, is based on the notion that all men are created equal and given INALIENABLE RIGHTS BY OUR CREATOR. The very least this country could be accused of is being merely theist, but I'd settle for that!

The question is, where does the nation go from there? Do we shed our Christian heritage and our traditional Judeo-Christian values in favor of a more secularized, atheistic liberal society where God has no role in morality or justice?

2006-10-26 17:34:21 · answer #3 · answered by Anonymous · 0 0

What do you mean by "a Christian country"? If you go back to the very basics through history, it was founded on the basis of a Christian philosophy...

The question to ask is what Christianity is supposed to be about. Don't base the answer on the media, nor on the answers from this forum; this is too narrow a base, and I think that because people here do not have to face up to what they say, they do not take this 100% seriously and say things to try to stir up trouble. I also stumbled onto a group on here by accident who pretend to be Christian but are not, and who post things to make the Christian side sound, in their words, "dumb". Really ethical, I know, and it hurts those of us who are trying to use this as a serious exchange of views.

2006-10-26 17:27:31 · answer #4 · answered by chico2149 4 · 0 0

While our founders wanted freedom of religion for every citizen, they also established Christianity as the officially "endorsed" religion... That is why a Maryland court declared in Runkel v. Winemiller (1799), "By our form of government, the Christian religion is the established religion; and all sects and denominations of Christians are placed on the same equal footing."

It's just like how we are a democratic country (a limited democracy) even though not every American citizen believes that democracy is the best system of government (and every citizen is entitled to that viewpoint). That is why our money does not say "in the Hindu gods we trust" or "in Buddha we trust", etc.

Of course, in recent years, the Courts have dramatically redefined the separation of church and state to lead us to believe that we were never a Christian nation...

2006-10-26 17:25:14 · answer #5 · answered by whitehorse456 5 · 0 0

Agreed. I think you'll find that this is supposed to be a country of freedom from religious oppression, of free choices and free will within the law. Yes, of course the religions around the world wish to lay claim to countries and citizens because, after all, they claim that it was their God who is responsible for them all. Most of these religious scripture-thumpers were not brought up with freedom of choice at all; they were probably brainwashed into their beliefs at an early age by parents and/or schools.

2006-10-27 12:12:49 · answer #6 · answered by Musicol 4 · 0 0

What I think it means is the Biblical principles that this country was founded upon. Our Constitution, our first amendments, the Bill of Rights were all designed on Biblical principles. For example, the Ten Commandments state that it is wrong to murder someone or to steal. Well, you know what, these issues are addressed in the Constitution of the United States as well as in the constitutions of all the states.

When our founding fathers drafted the Declaration of Independence and the Constitution, they included religion in the form of religious freedom.

This is what is meant by a Christian country: not that all are Christians, but just those principles of Life, Liberty, and the pursuit of Happiness.

2006-10-26 17:32:02 · answer #7 · answered by bro_ken128 3 · 1 1

It means that you are reading the wrong articles, if you are talking about the US at least. These are the same type of theocrats that exist in Iran, Saudi Arabia, etc., who think that the government should be run under the principles of their religion, and their religion only. The US is full of Jews, Hindus, Muslims, Mormons, Quakers, etc., and they would probably disagree with the thought that they are supposed to live under evangelical Christian law.

2006-10-26 17:30:12 · answer #8 · answered by Chestrensen 2 · 0 0

Agreed. The predominant religion is Christianity, but that doesn't make all of us who live here Christians. No one group can claim anything except that they have been blessed to live in a country where they are free to believe what they choose.

2006-10-26 17:28:27 · answer #9 · answered by buttercup 5 · 2 0

A predominantly Christian country (practising and non-practising) that tolerates other religions, as well as atheists. It's considered un-PC to state that, but there's no disguising the fact. In a way I'm glad; under anything less tolerant, Islam for example, I'd be called "uncovered meat" because I don't wear a headscarf (the mullah in Australia who thinks women deserve to be attacked for not wearing one), which in turn denigrates decent law-abiding men as being as primitive as they are portrayed to be by Islamic standards.

2006-10-26 17:38:49 · answer #10 · answered by Anonymous · 1 0

I believe that the U.S. is considered a Christian nation because it was founded on mostly Christian principles. I do agree that our country is made up of people of many different religious persuasions, and of those who have none. We don't see the Jewish religion predominating in all parts of our society, nor do we see this with the Muslim faith, as we do with the Christian faith.

2006-10-26 17:27:20 · answer #11 · answered by brother g 2 · 1 0
