It's not necessary and it's not right. I hope the US can avoid turning into a theocracy.
2006-11-13 11:17:39
·
answer #1
·
answered by ? 7
·
1⤊
0⤋
The US was founded upon Deism, not Christianity. Three of the first four founding presidents publicly denounced Christianity.
The concept of "deism" covers a wide variety of positions on a wide variety of religious issues. Following Sir Leslie Stephen's English Thought in the Eighteenth Century, most commentators agree that two features constituted the core of deism:
* the rejection of revealed religion — This was the negative or critical aspect of deism.
* the belief that reason leads us to certain basic religious truths — This was the positive or constructive aspect of deism.
Deist authors advocated a combination of both critical and constructive elements in proportions and emphases that varied from author to author.
Critical elements of deist thought included:
* Rejection of all religions based on books that claim to contain the revealed word of God.
* Rejection of the claim that the Bible is the revealed word of God.
* Rejection of reports of miracles and prophecies.
* Rejection of religious "mysteries" such as the doctrines of transubstantiation, the Trinity, the Incarnation, etc.
* Rejection of the Genesis story of creation and the doctrine of original sin.
* Rejection, by some deists, of only those parts of the Bible that contain miracles, prophecies, or mysteries.
* Rejection of Christianity.
Constructive elements of deist thought included:
* God exists and created the universe.
* God wants human beings to behave morally.
* Human beings have souls that survive death, i.e. there is an afterlife.
* In the afterlife, God will reward moral behavior and punish immoral behavior.
Some Deists rejected the claim of Jesus's divinity but continued to hold him in high regard as a moral teacher (see, for example, Thomas Jefferson's famous Jefferson Bible). Other, more radical, Deists rejected Christianity altogether and expressed hostility toward it, regarding it as pure superstition. In return, Christian writers often charged radical Deists with atheism.
As you will note, God is not mentioned once in the Constitution; the Declaration of Independence refers to a Creator, but not to the Christian, Jewish, Muslim, or any other specific god.
It was never meant to be a Christian nation, but rather a nation where all religions may flourish of their own accord.
2006-11-12 19:14:23
·
answer #2
·
answered by doppelganger918 2
·
3⤊
0⤋
It is not necessary, and contrary to popular belief, no, this country was not founded by just Christians; most of the founding fathers were either agnostics or atheists. Benjamin Franklin and Thomas Jefferson, just to name a few.
2006-11-12 19:05:38
·
answer #3
·
answered by RoboTron5.0 3
·
8⤊
0⤋
The US (and Canada) were originally founded as Christian nations. Now the US is turning against God and sin is rampant; the same is true of Canada. Once Christian nations deteriorate into complete immorality, there will be very little hope for the rest of the world. If God doesn't judge North America, He's going to owe Sodom and Gomorrah an apology. If you read The Decline and Fall of the Roman Empire, you'll notice that North America is going in the exact same direction and is falling apart from the inside out. Those who do not learn from history are destined to repeat it. Check out your money some time and read what's written on it: In God We Trust. Then there's God Bless America. There's no denying that the US (and Canada) were founded as Christian nations. Read the history books and you'll see that the forefathers of our two great nations, especially the US, were fleeing persecution for their beliefs and wanted a place where they could practise those beliefs in peace and safety.
2006-11-12 19:14:45
·
answer #4
·
answered by utuseclocal483 5
·
0⤊
4⤋
Necessary? It's not.
But the historical record of our Founding Fathers indicates that they founded the country...
#1 ...for religious freedom, rather than living under the Official State Religion of Britain.
#2 ...and formulated its laws upon Biblical Judeo-Christian values.
#3 ...based on their own predominant Christian faith.
2006-11-12 19:15:04
·
answer #5
·
answered by Bobby Jim 7
·
0⤊
2⤋
There are some fundamentalist Christians who believe it's a necessity to impose their doctrine upon all laws and issues in the United States (e.g. prayer in school). It is just these fundamentalists or evangelicals who pursue this, not all Christians. But these are extremists who, in their minds, believe only their faith and their rationale are the correct way.
2006-11-12 19:55:20
·
answer #6
·
answered by Professor Bradley 3
·
2⤊
1⤋
In my eyes, it is not necessary. Diversity is a great thing (and yes I am a Christian).
2006-11-12 19:03:20
·
answer #7
·
answered by kristalshyt 3
·
9⤊
0⤋
Who said it was "necessary"?
The US was founded on religious freedom. It's one of the great ironies that it is turning into one of the most religiously intolerant countries on Earth.
2006-11-12 19:03:42
·
answer #8
·
answered by Bad Liberal 7
·
9⤊
0⤋
Maybe you should go and live in a non-Christian country; then you'll understand and appreciate living in a "Christian" country.
2006-11-12 19:16:20
·
answer #9
·
answered by Anonymous
·
1⤊
2⤋
Years ago, before people began becoming atheists and following other religions in the USA, we had God on everything, even in our schools and in our texts. This country needs to get back to Biblical standards and put God back in America again. There are way too many freedoms in this country for the lost who aren't saved Christians. We should have a peaceful country and not let the devil's people get away with junk.
2006-11-12 19:07:45
·
answer #10
·
answered by Anonymous
·
2⤊
5⤋