Yes, the Bible does state that. Good thing that the Bible doesn't actually define a woman's role in life.
A woman defines a woman's role in life... if she follows the Bible, that is her choice.
My personal POV is that a woman's role in life is the same as a man's - to be as happy as possible without hurting others in order to achieve that happiness.
2007-09-22 02:45:45 · answer #1 · answered by Snark 7 · 2⤊ 3⤋
Yes, the Bible does state that a woman is not to lead or teach in the church over men. Christ is the head of the church, and men are the head of women. They are to lead their families spiritually and protect them spiritually as well. A married woman's primary ministry must be her husband and family. In our house, it has a slightly different tone, as my husband became disabled a little over two years ago by losing a leg to diabetes, and now the other foot is nearly unusable. Thus, I work outside the home, and he helps around the house as he can. We are careful, however, that his headship is not altered.
I know that many here will completely disagree with that. It is against everything the world believes. Think about this, though. When women were women, and in a more traditional, Godly role, there was a much lower divorce rate, our schools were tops, there were fewer obese children, crime rates were lower, and so on. The more we stray from what God intended, and turn from Him, the more we break up our families, communities, etc.
2007-09-22 02:45:20 · answer #2 · answered by lovinghelpertojoe 3 · 3⤊ 2⤋
Kinda strange their 'god' gave the uterus to the evildoers, eh? ;)
Since females are by nature the teachers of children from birth, the Bible contradicts the divine, which means the Bible is in error.
2007-09-22 02:45:39 · answer #3 · answered by American Spirit 7 · 2⤊ 4⤋