There are several Christians here who have used England as an example and don't see why we can't make Christianity the "official religion" (I suppose besides that whole First Amendment thing). Why? WHY must Christians plaster their faith on every street corner, in every school, in every part of our lives, and then blame the world's woes on the fact that we don't all want to believe like them? Why is respecting their fellow man's right to make his own choices so hard? Why on earth would we want it to be the official religion? REGARDLESS of what you THINK this country was "founded on," it has EVOLVED (oh no, the "E" word!!) into a melting pot of faiths. People risk their lives to get here in order to have FREEDOM OF RELIGION. THAT is what America is based on, so why change it?
2007-11-22 11:42:47 · 18 answers · asked by ~Heathen Princess~ in Religion & Spirituality