
2 answers

Christianity came and spoiled the beauty of a pure and intact world. Men and women lived in harmony with nature, and in nature they found many of their gods. Women and men each had their roles and their responsibilities, and they fulfilled them with no problem at all... Then Christians came, and women were pushed into the kitchen with no rights to do anything, which caused, some centuries later, the appearance of feminists, who now try to take over the world, taking what they already have and what they deserve, and everything else they want.

2007-03-01 04:42:54 · answer #1 · answered by User 4 · 0 1

In my opinion it invalidated their beliefs in their own God(s).

2007-03-01 02:23:30 · answer #2 · answered by Anonymous · 2 0
