Christianity came and spoiled the beauty of a pure and intact world. Men and women lived in harmony with nature, and in nature they found many of their gods. Women and men each had their roles and responsibilities, and they fulfilled them without any problem at all... Then the Christians arrived, and women were pushed into the kitchen with no rights to do anything, which caused, some centuries later, the rise of feminists, who now try to take over the world, reclaiming what they already had and what they deserve, and everything else they want.
2007-03-01 04:42:54 · answer #1 · answered by User 4 · 0⤊ 1⤋