The Treaty of Tripoli, from 1796, states:
"As the government of the United States of America is not in any sense founded on the Christian Religion; as it has in itself no character of enmity against the laws, religion or tranquility of Musselmen; and as the said States never have entered into any war or act of hostility against any Mehomitan nation . . ."
The text appears at http://www.yale.edu/lawweb/avalon/diplomacy/barbary/bar1796t.htm
2006-12-05 00:41:23 · 9 answers · asked by Anonymous in Society & Culture ➔ Religion & Spirituality
The full name of the treaty is "The Barbary Treaties: Treaty of Peace and Friendship, Signed at Tripoli November 4, 1796"
2006-12-05 00:42:19 · update #1
No. While many countries do have official religions, the United States does not. From its founding, the United States has been a nation of religious freedom. While Christianity is the overwhelming majority religion, I would not say that this makes the United States a Christian nation. I'm sure there will be many outspoken opinions on this topic, but the bottom line is that factually the answer is no.
Shawn W.
B.S. History
Grand Valley State University, 1998
2006-12-05 00:46:48 · answer #1 · answered by Shawn W 1 · 1⤊ 0⤋
From its founding, the United States has never had an officially recognized religion. Even though the majority of the population is Christian (though that is slowly changing), without an official state religion the U.S. cannot be officially classified as a Christian nation.
The Treaty of Tripoli, as quoted above, does contain a clause stating that the United States is "not in any sense founded on the Christian Religion." That treaty was signed by the President and ratified by the U.S. Senate. Under the Constitution, ratified treaties with foreign governments carry the full weight of law. Therefore, by law, the United States is specifically NOT a Christian nation.
2006-12-05 09:38:05 · answer #2 · answered by Lone 5 · 0⤊ 0⤋
Our nation was founded upon belief in worshipping God, and many of our forefathers were Christians. I know many will now say they were not, but I have George Washington's vision that God gave to him right here in my desk. Sadly, since the USA has let many foreigners into our country, our nation is on a downhill slide away from what it was really founded upon.
2006-12-05 08:49:33 · answer #3 · answered by birdsflies 7 · 0⤊ 0⤋
I believe it used to be a Christian nation, not as a matter of government but in the sense that most people were Christians. With massive immigration and other facets of our changing culture, there are a lot of non-Christians in our country today.
2006-12-05 09:34:41 · answer #4 · answered by travelguruette 6 · 0⤊ 0⤋
The US is considered a Christian nation.
2006-12-05 08:45:42 · answer #5 · answered by RB 7 · 0⤊ 0⤋
It doesn't seem in the least Christian in its relations with its neighbors and in foreign policy.
2006-12-05 08:49:22 · answer #6 · answered by Bad Liberal 7 · 0⤊ 0⤋
If it were, they'd be paying a little more attention to that commandment that says "Thou shalt not kill."
2006-12-05 08:42:56 · answer #7 · answered by Anonymous · 1⤊ 0⤋
Yes, I think so.
2006-12-05 08:42:10 · answer #8 · answered by Fredrick 1 · 0⤊ 1⤋
They are atheist.
2006-12-05 08:44:26 · answer #9 · answered by lostship 4 · 0⤊ 0⤋