Somehow the world has come to accept the narrow definition of the word Christian that the fundamentalists who call themselves Christians have promoted.
Politically, many believe that to be Christian you have to be a Bush-loving, red-state Republican.
Socially, you have to be anti-abortion, anti-gay, pro-gun, and intolerant of others' beliefs.
Biblically, you have to be a literalist.
How can we take back our faith, our God, and our country before the fundamentalists destroy this world and turn everyone away from God with their rhetoric? Will they start a holy war right here in America with their extremist beliefs?
2007-04-04 06:50:16 · 23 answers · asked by James M