Yes and no. Some people say they are Christians and give Christians a bad name.
2006-10-08 18:07:23 · answer #1 · answered by Taylor 3 · 1⤊ 1⤋
True Christians who follow Jesus and not the Republicans or Democrats are the ones who make this country a better place. The problem is that too many are following the "Christian" positions in the respective political platforms and forgetting that Jesus doesn't give a rip which party they belong to. Jesus said that all people will know His disciples by their love (John 13:35), not by their political affiliation.
For that matter, I'm working on a project to point out that Jesus is neither a Republican nor a Democrat. Check out the first link below.
2006-10-09 03:14:11 · answer #2 · answered by Pastor Chad from JesusFreak.com 6 · 1⤊ 1⤋
I don't think any one group of people can be given credit for making this country either good or bad.
Right now, Christians are doing their best to shape it in accordance with their beliefs, which I feel is wrong, but that doesn't make them bad people. The liberals and secular folks are trying to keep the country from becoming a theocracy, and Christians have a problem with that... but again, it doesn't make them bad people.
If we start saying one group is responsible for making the country better, then that leaves all of the other groups in the "not making it better" column and that's just not accurate.
And for those who state that Christians founded this country, please go back and read your history books because that is not fact... it's fallacy, passed on by all of the churches in this country (specifically those that are evangelical or fundie Christian).
2006-10-09 01:13:23 · answer #3 · answered by Rogue Scrapbooker 6 · 1⤊ 2⤋
Yes and no.
I can't say that the Christian pilgrims who came to America long ago, took over Native American land, killed so many of them, and forced the others to convert to Christianity made this country "better."
All I can say is that without Christianity (or any other religion), this country (and the world) wouldn't be the same as it is today.
2006-10-09 01:12:24 · answer #4 · answered by Anonymous · 0⤊ 2⤋
I think Christians who emphasize love make anywhere they are a better place. Conversely, I think Christians who emphasize judgment and morality make things generally worse. Acceptance gets my vote; self-righteousness gets my boot. And Christians have no corner on the market in either of those. However, their community tends to tolerate and produce more judgmental people than any other community I'm aware of.
2006-10-09 01:10:45 · answer #5 · answered by NHBaritone 7 · 2⤊ 1⤋
Christians sure didn't make this country a better place for the indigenous people, the Native Americans. And I bet most of the animals living on this continent could have done without the hordes of pious white humans who believe that their god gave them the Earth to dominate and control.
2006-10-09 01:18:46 · answer #6 · answered by nebtet 6 · 1⤊ 2⤋
Sure. Christians believe in the things the founding fathers did: the sanctity of contract, truth, and respect for people and property.
Compare the US to other countries without a Christian presence.
2006-10-09 01:12:53 · answer #7 · answered by Anonymous · 1⤊ 2⤋
It's not just Christians who make this country better or worse. All of us are responsible for it.
2006-10-09 01:10:45 · answer #8 · answered by buttercup 5 · 0⤊ 2⤋
Absolutely. When the Christians are raptured, this world will become a hell on earth and the wrath of God will begin. I feel that the only reason the world is somewhat intact is because the believers are still here.
2006-10-09 01:14:03 · answer #9 · answered by ckrug 4 · 1⤊ 3⤋
No... good people make this country a better place. For Christ's sake, those presidents on our money were deists.
2006-10-09 01:08:39 · answer #10 · answered by Anonymous · 3⤊ 2⤋