In the 1800s and early 1900s, when American women were fighting for the rights they have now, the principal roadblock was the near-universal view among adherents of the Christian faiths about what the Bible said a woman's role in society should be (submit to your husband, stay quiet in church and in public).
In fact, as I understand it, the only real supporters of women in this respect were the secularists of that era.
For all of Christian history up to that point, the understanding of Christians was that the Bible demanded that specific role for women. Then, when women could no longer be denied those rights, it all of a sudden became conventional wisdom that the Bible didn't really mean that and wasn't supposed to be interpreted that way. I think it's a huge gap in logic to believe that Christianity can collectively forget what was understood to be the biblical view across all of Christian history. If I am mistaken here, please set me straight. What am I missing?
Asked by Anonymous · 2007-01-14 13:43:01 · Society & Culture ➔ Religion & Spirituality · 6 answers