The Christians came in and destroyed it, saying their fairy tale is the right one and all others are false gods. That's what happened; then they brainwashed the countries.
2006-11-05 22:46:07
answer #1
answered by Anonymous
Yes. It was Paganism (which encompasses cultural/folk religions). When Christianity came out in full force, it drove a lot of it underground or destroyed it completely (especially our writings in Rome, Egypt, etc.). The Church called it demon worship, which of course made many turn from it. The rest is considered myth, superstition, or old wives' tales.
Surprisingly, though, a lot of it has survived through the Christian religion itself, via its holidays.
2006-11-06 10:42:58
answer #2
answered by riverstorm13 3
The same thing that's happened since man began: they all ended up under some sort of oppression or were simply wiped out. Western culture was built on a Christian (Bible-based) foundation and, because of that, has flourished above any other... so ask yourself, have they all come to naught?
2006-11-06 07:30:26
answer #3
answered by r-u-t-1-i-need-2-talk-2 1
Many of them still do. I'll point out Shintoism in Japan, Hinduism in India (and the branch-off religion of Buddhism, widely practiced both in and around India as well), Native American beliefs, the Mayans (who are still around, and practice a form of their own religion mixed in with Catholicism, much like another example in Santeria), and Australian aborigines and their beliefs, to name a few.
2006-11-06 06:46:24
answer #4
answered by angk 6
Yes, they did, and people used to live in harmony. Then some groups started to organize churches, began controlling others, and imposed the fear of hell and devils.
2006-11-06 07:51:30
answer #5
answered by george p 7