I was under the impression that the US was a secular society, where religion had no influence in the workings of the country (in politics, for example), and no religion was favoured over another (freedom of religion).
This is supported by the First Amendment:
"Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."
So why is the US now a Christian nation?
Where no [successful] politician admits to being anything other than a Christian?
Where strong religious lobbies have influence and sway over politics and education in schools?
I'm just interested in where and why this changeover happened. Can anyone help me?
Asked by Adam L · 2007-11-28 23:01:58 · Religion & Spirituality · 14 answers