
Mentioning God doesn't mean Christ (many founders were deists).
The majority being Christian doesn't legally mean anything.

2007-03-06 17:26:31 · 9 answers · asked by ajj085 4 in Society & Culture Religion & Spirituality

9 answers

They think it's their country because they have the majority.

2007-03-06 17:30:40 · answer #1 · answered by ChooseRealityPLEASE 6 · 1 0

Generally, when people say that America was founded as a Christian nation or on Christian principles, they are not saying that it has a Christian government or anything like that. America is not a theocracy, and the Constitution prohibits federal laws that infringe on or establish religion, or that favor one religion over another.

What is true is that America, including its politics and culture, is strongly influenced by the people who settled it. Many of our land's first European settlers were Christians of one sort or another seeking a haven to practice their faith as they saw fit, since England at the time did not have legal separation of church and state. The British crown ruled by "divine right" and had a great deal of power over the church and religious practices. Look up information about King Henry VIII to see how bad this can be.

If you want to see evidence for Christianity's influence in America, you can find a variety of Christian phrases peppered in legal institutions and materials, such as the phrase "In God we trust" on US currency, or the existence of a "Creator" in the Declaration of Independence.
"We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights..."

It would be a gross oversight to say that America is an entirely Christian nation or that all Americans are Christians, but our government and our people have always had close ties to a belief in God. The important thing to remember is that America is indeed the land of the free, and that, both in word and deed, Americans have the right to believe what they choose, and to practice whatever religion they choose without the government's interference, so long as they do not endanger the rights of their fellow Americans.

2007-03-07 02:45:55 · answer #2 · answered by Barry D 2 · 0 0

It was. The first to come here were downtrodden and displaced Christians.

Once they made this place profitable, the less religious took over and took control.

But if you read the history books, the Congregationalists, the Presbyterians, the Anabaptists and the Catholics were here LONG before the BURGHERS of London, like Tommy Jefferson and Georgy Washington and Benny Franklin, who were all English city snobs.

Had the United States been FORMED in 1690 it would have been declared a Christian nation.

By 1774 it was polluted by 50% English "carpetbaggers" who came here to take control and strike it rich in the environment the religious settlers had started, often paying with their own lives in the harsh conditions.

If you look at who controlled things, it was not 3rd- and 4th-generation Americans; it was 0- and 1st-generation Londoners.

I'm not putting them down. They did GUARANTEE us religious freedom, but they didn't land here until Manhattan was a hub of world commerce.

Remember there was SUCH a majority of English Loyalists that they HELD BACK the process of independence for years.

The Declaration of Independence was NOT passed by a VAST MAJORITY; it squeaked through.

2007-03-07 01:36:30 · answer #3 · answered by Anonymous · 0 2

It is desperation. America was founded as a nation of people free to worship whatever they chose, however they chose. It bothers the "religious right" when others aren't Christian and won't allow themselves or their children to be forced to worship that mythological system and god structure.

2007-03-07 01:36:32 · answer #4 · answered by Huggles-the-wise 5 · 3 0

Actually, the founding fathers were either atheist or very secular.

They'd be rolling in their graves seeing these idiotic religious people trying to overrun our country.

2007-03-07 01:45:29 · answer #5 · answered by JP 7 · 2 0

No basis at all.
Check the Bible. The USA isn't even mentioned there.

2007-03-07 03:52:26 · answer #6 · answered by Anonymous · 0 0

Our Constitution, Bill of Rights and Declaration of Independence were founded according to the Bible. Our very laws were written according to the Bible.

2007-03-07 01:58:14 · answer #7 · answered by Anonymous · 0 2

Here's a page listing evidence of America's pagan origins:

http://nobeliefs.com/pagan.htm

2007-03-07 01:53:47 · answer #8 · answered by answer faerie, V.T., A. M. 6 · 0 0

If you look into your history books you will see why. Try it; open up a book.

2007-03-07 01:37:31 · answer #9 · answered by Anonymous · 1 2
