Many people argue that our founding fathers were "Christian" and therefore our country is a "Christian Nation". (I disagree, since many of our founding fathers were Freemasons and Deists, but I digress.)
Our founding fathers were also white males. Does that make America a white male nation? After all, they presided over a nation where blacks were enslaved and neither blacks nor women were allowed to vote. They obviously thought white males were superior to everyone else. So by the "Christian Nation" logic, shouldn't we also say that America is a "white male nation"?
(I don't think America is a "Christian" nation or a "white male" nation.)
2007-11-10 07:54:28 · 27 answers · asked by queenthesbian (Level 5) in Religion & Spirituality