I ask this question to build on others in the debate over whether the U.S. government and the nation were founded on Christian principles.
References are made to the Pledge of Allegiance (to which the phrase "under God" was later added), the motto "In God We Trust" on our money, and references to God and natural law in documents like the Declaration of Independence and the U.S. Constitution.
But why are folks so quick to claim that these references are "Christian", particularly when many of the "founding fathers" were Deists?
And why are documents like the Treaty of Tripoli, which clearly states that the government of this country was not founded on Christianity, conveniently ignored?
If I say that I believe in God, does that make me a Christian?
I think not - and I think that many Christians would be the first to assure me of that fact.
Do Christians have some exclusive right to God?
I think not - but I know that some Christians argue that they do, through Jesus.
2006-08-21 12:17:41 · 26 answers · asked by Anonymous in Society & Culture ➔ Religion & Spirituality
Mr. Mister: I didn't ask if believing in God made one a god, so I don't see the comparison to asking about hot dogs.
2006-08-21 12:27:55 · update #1
Brahden: George Washington, Benjamin Franklin, and Thomas Jefferson are considered three of the "founding fathers", and they were Deists.
2006-08-21 14:31:32 · update #2
Deists or not, the separation of powers in the government constituted in Philadelphia was modeled on a scripture from the Old Testament. I'll look it up for ya. Jefferson was a well-read man.
p.s. Most Christians, like myself, don't come anywhere near living the ideal of Christ. But we do try...
2006-08-21 15:22:52 · answer #1 · answered by Anonymous · 0⤊ 0⤋
God is a generic word, as easily used to reference Cernunnos and Zeus as Jehovah, Allah, or an as-yet-unnamed deity. Too many fundamentalist Christian leaders want it to mean only their way for one main reason: power. If important documents say something other than what they wish to believe, then those documents are wrong or should even be hidden away. In the groups I work with, God rarely refers to Jehovah, but instead refers to any number of other ancient deities. All I am able to assume is that the person has faith in a male deity and nothing more.
2006-08-21 19:39:02 · answer #2 · answered by Moonsilk 3 · 0⤊ 0⤋
No I do not!!
Does saying you believe in God make you a Christian? No: it means you believe in God... which even the devil does!!!
Do Christians have some exclusive right to God... No: everyone does... through Jesus Christ!!! It's a matter of choice!!
'References are made to the Pledge of Allegiance (to which the phrase "under God" was later added), the motto "In God We Trust" on our money, and references to God and natural law in documents like the Declaration of Independence and the U.S. Constitution'... bla, bla, bla!!!
& then God is taken out of, & banned from, all US schools!!
2006-08-21 19:58:44 · answer #3 · answered by englands.glory 4 · 0⤊ 1⤋
I don't really think like that. To me, if someone says they believe in GOD, then someone just said they believe in GOD. That's not to say I do assume they are a Christian, but that's not to say that I don't.
Think about that. Who do you know who just walks into a room, says, "I believe in GOD!!!" and just stands there grinning like an idiot? In the rare chance of this happening, have they ever followed up with, "Do you think that I am a Christian?"
The human mind doesn't work like that. People don't pay that much attention to every little noise they hear. In fact, most people go to great pains to shut most of the noise out. No one articulates a thought and forms an opinion on every little statement they hear. We just don't have that much attention to spare. The only time anyone would ever find themselves in a position where they would need to form an opinion after hearing or reading this statement would be either during a conversation involving GOD (at which time, they are likely to be given more than enough supporting information with which to form their conclusion) OR when they find themselves on Fear Factor with the challenge of deciding which of the ten people that say they believe in GOD are Christians....... while eating the fur off the backs of day-old road kill.
People just plain don't work like that.
Adder_Astros
Powerful Member of the House of Light.
adderastros.com is temporarily down for renovation.
2006-08-21 19:45:38 · answer #4 · answered by Anonymous · 0⤊ 0⤋
I think you already know that the answer to your question is NO. Lots of people believe in God but continue to live in sin. The Bible says you must be born again. Until you accept Christ as your personal savior, you are not a Christian. Believing in God and being a Christian are two completely different things. God bless you.
2006-08-21 21:02:36 · answer #5 · answered by Blessed 3 · 0⤊ 0⤋
When you capitalize the "G" it usually refers to the god of Abraham. This is the same guy as Allah (Islam), The Lord (Judaism), and God the Father (Christianity).
More Americans are Christians than are Jews or Muslims. Therefore, in America, when you see the word "God" it's most likely referring to God the Father, of Christianity.
2006-08-21 19:30:48 · answer #6 · answered by Anonymous · 0⤊ 0⤋
I would have to say that I do not automatically assume someone is Christian just because they believe in God. The Bible writer James said something very interesting: "You believe there is one God, do you? You are doing quite well. And yet the demons believe and shudder." (James 2:19) Obviously, the demons are not Christian. This scripture pointed out that more than just believing in God is necessary to have his favor.
I believe it is very important to find out as much as possible about God, regardless of what faith a person belongs to. Just like any loving parent, he has requirements for his children, and it is vital to make sure we know what these are so we are assured of his favor.
2006-08-22 19:30:52 · answer #7 · answered by izofblue37 5 · 0⤊ 0⤋
NO, and that is the problem: people who believe in God are not necessarily Christians. You are a Christian if you believe that Jesus died on the cross for your sins. That's why Judaism and Christianity are separate. They were founded on "Christian" principles, but not necessarily by Christians. I think a better term would be Biblical principles.
2006-08-21 19:27:37 · answer #8 · answered by Anonymous · 0⤊ 0⤋
No, just as I wouldn't assume someone who said they were a Christian is one. Some people believe in many different gods, just as some believe in many different ways to be a Christian. Just because someone is Catholic doesn't make them a Christian. Believing in God doesn't make them a Christian. In order to be a Christian we must be more specific about Christ's blood than to just say God, even though Jesus is part of the Godhead.
2006-08-21 20:27:42 · answer #9 · answered by Airman_P 2 · 0⤊ 0⤋
Next time you call some of the founding fathers Deists, please provide the names of the founding fathers that you speak of.
I do not assume when someone says they believe in God that they are Christian.
Jesus said He is the only way.
2006-08-21 19:57:11 · answer #10 · answered by Anonymous · 0⤊ 0⤋