I have heard Christians claim that America was founded on Christianity. I reject that claim. But even granting that it were true, it raises a further question: even if this nation were founded on Christianity, should it be? Germany was a Nazi nation in 1933, but is that good? Christians commit a historical fallacy in assuming that because something is historical, it is correct or good. By that logic, slavery should be reinstated and only wealthy male landowners should be allowed to vote. So even if this nation were founded on Christianity, which it isn't, I believe we should "UNFOUND" it and maintain a separation of church and state. Not just because it's in the Constitution in the very First Amendment, but because it is the right thing to do.
2007-05-31 15:55:41 · 11 answers · asked by doogsdc in History