Christianity is the worst religion in the world, and it is trying to turn America into 1492 Spain by gaining control of Congress.
2007-03-22 05:46:05
·
answer #1
·
answered by Anonymous
·
3⤊
2⤋
I believe Christianity is destroying America. America was founded on the ideal that all people are equal in their rights. Christianity (especially evangelical Christianity) is based on the belief that its adherents are above others because they are "saved," and this translates into the destruction of all other beliefs (or, when you get right down to it, opinions).
Christianity as practiced by the hard right is completely opposed to democracy and the American ideal.
2007-03-22 13:05:55
·
answer #2
·
answered by simon 2
·
3⤊
0⤋
"Reality," is what is destroying not just christian doctrines but religion for mankind is not as gullable as it has been since the beginning. It has little to do with "America." Religions will always flourish though for the species is a ball of confusion which seeks everything delusional to fill it's empty time with and they will always "fight" and "kill" to remain PART of the human NEED to comfort themselves in an unpredictable world.
2007-03-22 12:52:14
·
answer #3
·
answered by Theban 5
·
2⤊
0⤋
Yes and yes.
Christianity as it is practiced in this country is leading us down pathways which are evil. It is spreading hate and discrimination. It has sent thousands of Americans to die fighting Muslims in Iraq. It has done no good over the past half century.
On the other hand, America is destroying what Jesus intended Christianity to be. Rather than posting the Ten Commandments in public, Jesus would have posted the Beatitudes. Fat chance of the fat-cat Christians advocating that.
2007-03-22 12:44:14
·
answer #4
·
answered by Dave P 7
·
2⤊
2⤋
I don't see any signs of either. Christianity has been with us from the very beginning, and it hasn't destroyed the country.
Also, there are no signs that the country is making any effort to destroy Christianity.
I am a non-believer, but I see no reason that either should make any effort to destroy the other.
2007-03-22 12:56:50
·
answer #5
·
answered by Anonymous
·
1⤊
2⤋
America is "allowing" those who want to get a foothold on "trying" to destroy Christianity.
Hey, but God knew that that would be attempted too:
1 Peter 1:25: "But the word of the Lord endureth for ever. And this is the word which by the gospel is preached unto you."
God's word will not EVER go away.
2007-03-22 13:00:46
·
answer #6
·
answered by ViolationsRus 4
·
0⤊
3⤋
Both. Evangelicalism is a very low form of Christianity, far beneath the glories of Catholic Europe. America was founded as a secular Republic.
Yeah, they're both sunk.
2007-03-22 12:46:37
·
answer #7
·
answered by Anonymous
·
1⤊
2⤋
Christianity is trying to take over America... that is how it destroys. You just have to have the balls to say it out loud.
2007-03-22 12:45:56
·
answer #8
·
answered by Anonymous
·
3⤊
1⤋
Christianity isn't destroying America. Christianity is just bringing down those lies of Satan. So we're against things like abortion, murder, gay marriage, sleeping with whomever you want, lying, stealing, and all that. This world would be a better place if those things didn't exist.
Jesus said in Matthew 10:34, "Think not that I am come to send peace on earth: I came not to send peace, but a sword."
John 3:19: "And this is the condemnation, that light has come into the world, and men loved darkness rather than light, because their deeds were evil."
2007-03-22 13:44:27
·
answer #9
·
answered by Demon slayer 3
·
0⤊
5⤋
Neither. Power-hungry people are the ones who are destroying both.
2007-03-22 12:56:42
·
answer #10
·
answered by coco_loco 3
·
2⤊
0⤋