
2007-05-14 09:59:36 · 35 answers · asked by Anonymous in Society & Culture Religion & Spirituality


Americans are not all Christians; they have the freedom to worship as they please, as long as nothing they do infringes on the rights of others. America is a huge melting pot of peoples, ethnic backgrounds, traditions and beliefs. There is no way "American" could imply "Christian" when Americans are seen as anything but that. And they are the most hated country in the world, so there's your answer.

2007-05-14 10:26:24 · answer #1 · answered by Hot Coco Puff 7 · 6 2

Since the thirteen colonies were established by the English (WASPs), our "founding fathers", the creators of the Declaration of Independence and the Constitution, were likewise of English origin (most of whose recent ancestors had relocated to the colonies to experience religious freedom). This country's laws were therefore generally based upon Christian principles, but, fortunately, they were written by men with the genius and foresight to include prohibitions within the Constitution against the establishment of a state religion. However, as immigration to the new United States progressed, composed, at least in the Eastern states, almost entirely of European Christians, the great majority of Americans were (and still are) Christians, who seem to believe, fallaciously, that America is indeed a Christian nation.

2007-05-14 15:41:28 · answer #2 · answered by Lynci 7 · 0 0

No. I love getting scorned by people from other countries when the U.S. is the only country in the world where the government is FORBIDDEN from imposing a particular religion. Great Britain is officially Anglican, Germans are required to pay a church tax, the Scandinavian countries are officially Lutheran.... the list goes on and on.... the US was founded upon freedom OF and freedom FROM religion. It is one of the greatest things about this country, and although I respect Christianity, I oppose any effort by Christians to claim a preferred spot as Americans. You don't have one. You shouldn't, and hopefully never will.

"The United States is in no sense founded upon the Christian religion." Treaty of Tripoli (and possibly George Washington)

"Revealed religion has no weight with me." Benjamin Franklin

"I do not find in Christianity one redeeming feature." Thomas Jefferson

"This could be the best of all possible worlds if there were no religion in it." John Adams

"I disbelieve all holy men and holy books." Thomas Paine

"Religions are all alike, founded upon fables and myths." Thomas Jefferson

"In no instance have churches been the guardians of the liberties of the people." James Madison

"The Christian god is cruel, vindictive, capricious, and unjust." Thomas Jefferson

"What has been Christianity's fruits? Superstition, bigotry, and persecution." James Madison

"Whenever we read the obscene stories, the voluptuous debaucheries, the cruel and tortuous executions, the unrelenting vindictiveness with which more than half the Bible is filled, it would be more consistent that we call it the word of a demon than the word of God. It is a history of wickedness that has served to corrupt and brutalize mankind." Thomas Paine


"The Bible is not my book, nor Christianity my profession."
Abraham Lincoln

"I distrust those people who know so well what God wants them to do because I notice it always coincides with their own desires." Susan B. Anthony

2007-05-14 10:06:21 · answer #3 · answered by Anonymous · 4 1

Yes, to people living in other parts of the world it does. I've been to Pakistan, and many people there have the impression that all Americans are Christians.

2007-05-14 14:01:52 · answer #4 · answered by E.T.01 5 · 0 0

I assumed that a moderate Christian was somebody who was Christian but held political beliefs that were moderate. For example, a moderate Christian, instead of being pro-life (more Christian right), might be pro-choice, or pro-choice AND pro-life. I have always found it troubling when political stances are conditional on being identified with a certain faith, since that can easily be bastardized/corrupted into a "believe me, or you're banished from God's favour", which is a fear-based leverage device that I don't think accurately reflects Who God Is, or how God works.

2016-11-03 22:16:21 · answer #5 · answered by ? 4 · 0 0

American implies that we have the freedom to worship as we please.
God Bless

2007-05-14 11:34:32 · answer #6 · answered by Anonymous · 1 0

Only if you were born BEFORE 1960 or thereabouts!! We KNOW that the only reason every heathen, atheist, Christian (whether Catholic or Protestant) or Muslim wants to be in America is because of the Christian principles we were founded upon, and the whole world swims, floats, crawls, walks, rides and does ANYTHING NECESSARY to get within our borders, and then lies, cheats and hides, because of this Christian, praying, caring, loving nation!!
You don't see any of you folks, or those from other nations, doing whatever it takes to get to ANY OTHER NATION to live under Communism/Atheism/Buddhism/Shintoism/etc., or under totalitarianism or dictatorships under Sharia law!
Get a life and get some reality going on! NO ONE came here to be free FROM God; they run here FOR all the blessings He has shed upon us for holding Him sacred and King of Kings and Lord of Lords.
The younger generations don't even know the true history of this wonderful nation of liberty. They are insolent, rebellious and disrespectful, and THEY NEVER PAID THE PRICE FOR ANYTHING EXCEPT THEIR LATEST ELECTRONIC EQUIPMENT.
American implies Christian to me! I was born before "freedom" meant "I can do anything I choose, and to hell with any principles or codes of conduct that I don't like!!"
You know, there is still a universal law that remains as constant and real as the law of gravity: "whatever a man sows, that is what he will reap", believer or not!

2007-05-14 11:34:17 · answer #7 · answered by Anonymous · 0 1

Of course not. It simply means being a citizen of the US. "American" doesn't imply religion, race, political party, sexual preference, educational level, or any specific moral values.

2007-05-14 16:18:27 · answer #8 · answered by Witchy 7 · 0 0

No, it rather implies aggressive and merciless acquisitiveness under a thin veneer of Christianity.

2007-05-14 15:19:53 · answer #9 · answered by Anonymous · 0 0

Hmm, that's a toughie. America was founded so we could be free from religious persecution, yet here we are, almost 250 years later, watching Christians, as the predominant religion in the land, led by our dubiously elected President, impose their beliefs on us. Gay marriage? Abortion? Your fancy storybook says that stuff's not OK, so the rest of us have to abide by it. I can't swear allegiance to my own country without declaring myself, as a citizen of this nation, "under God." Hell, I can't even spend my own money without being reminded that I'm supposed to resign myself to trust in God. (Hooray for my debit card!)

2007-05-14 10:30:01 · answer #10 · answered by scottychop 2 · 3 0
