We're coming up on a new election year, and many people are outraged by the way our nation has been run. There is a massive divide between those who see us as a Christian nation and everyone else. Many think we need more of God in our government; too many already do, some say. Many think the founders of our country were right and that religion should have no place in politics.
It's my belief that Christians, or more specifically Christian Republicans, have set our country on a path of self-destruction. With foreign affairs at their worst and deeply immoral decisions being made by our government in the name of religion instead of progress, what justifies these actions? Why isn't anyone holding them accountable? Is it because over 70% of Americans are Christian?
Can anyone give a justifiable reason why something as tainted and wrong as American politics should involve religion? Or do you think it is overly religious people like Bush who have tainted our government?
2007-10-21 04:13:44 · 7 answers · asked by computerqfl in Politics & Government ➔ Government
I should add, this isn't about Christians as a faith or a people. It's about Christian extremists, such as the extreme evangelicals in the Bible Belt, where so many of our members of Congress come from. I would ask the same question if any religion held such tight control over government. This is about our leaders making their decisions based on religious views rather than conscience. I'm hoping for answers from those who support religion being involved in government.
2007-10-21 04:44:33 · update #1