It's been dominant in Europe and the Western Hemisphere for centuries, so I will say NO...
2007-08-24 16:49:17 · answer #1 · answered by Brian 7 · 3⤊ 2⤋
How does the South figure in? Even if it's a reference to the French Catholics, theirs is a Christ-centered religion as well.
Jeopardy had a question this week regarding American history, asking who said that there should be a Bible and a newspaper in every house. It was Benjamin Franklin. Several quotes from the founding fathers regarding the Bible, God, and religion are listed here:
http://www.constitutional.net/qff.html
"Of all the dispositions and habits which lead to political prosperity, religion and morality are indispensable supports.
It is impossible to rightly govern the world without God and the Bible."
- George Washington
And here is the whole quote from Ben Franklin:
"A Bible and a newspaper in every house, a good school in every district- all studied and appreciated as they merit -
are the principal support of virtue, morality, and civil liberty."
- Benjamin Franklin; March 1778
2007-08-27 15:41:43 · answer #2 · answered by djstocks 2 · 0⤊ 0⤋
No, it does not.
The abandonment of Christian values wrecks the society.
See the Roman Empire and the American South.
2007-08-25 01:51:07 · answer #3 · answered by ? 6 · 0⤊ 0⤋
Societies deteriorate when no morals and values are left. America was founded on Christianity, along with the rest of South and North America.
2007-08-24 18:04:26 · answer #4 · answered by - 6 · 0⤊ 0⤋
It's no different than any other religion if it dominates your country; it's just a different belief. Religion has its place and government has its place. Look at the Middle East: religion has kept them in clothing from 2000 years ago, and the only thing that grew over there was the oil the Europeans helped them get out of the ground. The hatred of the Jews has existed since Abraham, and it got worse after WWII. Once religion walked in over there, common sense left.
2007-08-24 16:48:05 · answer #5 · answered by margie s 4 · 3⤊ 1⤋
I might have to declare you're the one with the twisted perspective. To think that growing Christian faith is making the English-speaking world move backwards... hmm, I'd like to hear where you got your information to support this claim. I'd say that faith in Christianity moves people forward, not backwards. I would much rather live in a country full of Christianity than Islam or monkeyism. Christians are very kind-hearted people. Christians reach out to the world more than any other faith. And Germany has something to be proud of: wow, a gay politician. They're now far more advanced than everybody else. SIKE! To me they're the ones moving backwards. Somebody needs to give those people sex education and tell them that two pencil sticks don't go together. There's a reason a man and a man can't get pregnant: because they were not designed to be together.
2016-10-16 22:13:13 · answer #6 · answered by ? 4 · 0⤊ 0⤋
Well, it can be a detriment, but it has also been a force for change and growth. Many modern scientific disciplines had their foundation in the Church, and many modern rights are the direct result of the struggle of religious people.
That being said, what I can only call the prostitution of Christianity in American society for the purposes of political power has the potential to end democracy here.
2007-08-24 16:45:52 · answer #7 · answered by ? 4 · 3⤊ 3⤋
No. Liberalism and Socialism have already done irreparable damage to the entire country; look at the genocide of the unborn supported by the pro-"choice" crowd (Democrats).
2007-08-24 17:13:27 · answer #8 · answered by Jeremiah Johnson 7 7 · 1⤊ 1⤋
It has nothing to do with Christianity specifically.
It's all religion.
A far more pressing example for Christianity is the Republic of Ireland/Northern Ireland. Christianity ruined that island.
Islam has ruined Africa and the Middle East. It has left them 1400 years behind the Western world.
Zen Buddhism kept Japan in civil war for years.
Hinduism actually advocates people killing one another. The belief in reincarnation justifies killing people: their souls will live on.
Religion, in all its poisonous forms, wrecks society.
2007-08-24 16:45:57 · answer #9 · answered by brian 4 · 4⤊ 2⤋
Yes... I think it's because they can't be content with being Christians themselves, but think they have the right to force their religion on other people who already have their own belief systems! This has happened, to the detriment of native peoples all over the world... part of the arrogance of Christianity that really bothers me!
I would say that in most cases they have at least wrecked the native cultures, and in the more distant past they murdered people who refused to convert and burned books of knowledge... the list goes on, but I think this short but awful one is quite enough to make my point.
In my never-to-be-humble opinion.
2007-08-24 16:48:50 · answer #10 · answered by Anonymous · 3⤊ 3⤋