I ask this question to build on others in the debate over whether the U.S. government and the nation were founded on Christian principles.
References are made to the Pledge of Allegiance (to which the phrase "under God" was later added), the motto "In God We Trust" on our money, and references to God and natural law in documents like the Declaration of Independence and the U.S. Constitution.
But why are folks so quick to claim that these references are "Christian", particularly when many of the "founding fathers" were Deists?
And why are documents like the Treaty of Tripoli, which clearly states that the government of this country was not founded on the Christian religion, conveniently ignored?
If I say that I believe in God, does that make me a Christian?
I think not - and I suspect many Christians would be the first to confirm that.
Do Christians have some exclusive right to God?
I think not - but I know that some Christians argue that they do, through Jesus.
2006-08-21 12:17:41 · 26 answers · asked by Anonymous