Sadly, Christianity has been feminized by American culture. In reality, Christians throughout history have been warriors and leaders. Somehow, once Christians reached the New World, there came this idea that Christians were to be doormats. I believe it has a lot to do with taking certain scriptures out of context. God allowed his men to wage war on the unjust; He brought plagues and took the lives of people who were against him in the Old Testament. Jesus wasn't a pansy either. It really is just misinterpretation of the Bible, in my opinion, that has led to this idea that Christians are to be cheek-turning pansies. You should read "Wild at Heart" by John Eldredge, or go to mhbcmi.org and listen to Rob Bell speak. You might also like Donald Miller's writings.
2006-08-15 23:30:52 · answer #1 · answered by katethefabulous 3 · 0⤊ 2⤋
I am an atheist and was once a Christian.
I have noticed this also. I have often thought that male characteristics are somewhat 'vilified'.
EDIT:
Some of the answers have pointed out that the leaders are mostly male, but I think this reinforces the whole feminizing view. Male leaders' number one enemy, you could say, is the male population, not the female.
2006-08-16 06:28:28 · answer #2 · answered by CJunk 4 · 0⤊ 0⤋
I do not think that Christianity and Christian culture have been feminised.
Most churches I have been associated with have had male leadership and management.
The Catholic church is a prime example, with its vast hierarchy of male leaders.
Some other churches (the Salvation Army is one) have a higher proportion of female leaders, though they still maintain very strong, practical, gospel-based leadership.
One church training institution where I spent two weeks had a section in its library dedicated to 'feminist' theology; I was not sure what to make of that.
Peter
2006-08-16 06:33:39 · answer #3 · answered by Peter H 3 · 0⤊ 0⤋
All of civilization is being feminized, not just Christianity.
You see this trend everywhere you turn.
It's a womanized world now, and the new feminism isn't a true feminism.
It's more like promoting more crime, while viewing that crime with alarm.
"My God, isn't that awful?"
The man has been kicked out of the household.
He doesn't count for anything.
Enemies around?
Wonder why?
2006-08-16 06:31:20 · answer #4 · answered by Anonymous · 1⤊ 0⤋
I'm not quite sure I follow you. Most of the leadership of mainstream "Christian" fellowships is male; some allow female teachers.
I do agree that most of the work of the church is being done by women, and that is sad. God designed it differently, and the men need to step back up to the plate.
2006-08-16 06:26:09 · answer #5 · answered by steve 4 · 0⤊ 0⤋
Society as a whole has been feminized, especially if you factor in the lack of violence, the focus on cooperation rather than competition, and the allowance of tolerance rather than the abject persecution of others. So yes, Christianity has been feminized in that sense, for the better.
2006-08-16 06:25:38 · answer #6 · answered by noir 3 · 0⤊ 1⤋
I can't help you; I am Catholic, and men are the spiritual leaders and the heads of households, and Christ was a man and our Savior. Don't get me wrong, I am not complaining; I like it just the way it is.
2006-08-16 06:27:02 · answer #7 · answered by Debra M. Wishing Peace To All 7 · 0⤊ 0⤋
It's modernity and the recognition that men and women are created equal. If you want a religion that oppresses women, try Islam or Orthodox Judaism.
2006-08-16 06:24:57 · answer #8 · answered by Anonymous · 0⤊ 1⤋
The more things change, the more they remain the same... Why is it so? Because one had foreknowledge and chose to choose for the other... a concession assumes choice, so my answer is no...
2006-08-16 06:32:11 · answer #9 · answered by john g 1 · 0⤊ 0⤋
Yes, but it's a good thing.
2006-08-16 06:36:20 · answer #10 · answered by Princess 4 · 0⤊ 0⤋