It will happen, mark my words, it will happen. From the time literacy rates started to increase and people actually began to read the bible instead of having it read to them, the realization of the bible's absurdity has been steadily growing. As it stands now, people with an overly literal interpretation of the bible are shunned and avoided even by those who call themselves average Christians. These same Christians are teaching their children the dangers of a fundamentalist approach to religion.
The bible is what made Christianity what it is. Some 2,000 years ago, when Christianity was in competition with older religious beliefs, the fact that this new religion had the backing of a written text, unlike the older traditions that were kept alive largely through oral transmission, meant that Christianity had a much easier inception. So the bible gave Christianity its power to influence. Without the bible, what happens to Christianity?
2007-11-24 10:39:53 · 20 answers · asked by Anonymous in Religion & Spirituality