When I was small and taught about religion, it taught me that I shouldn't fear the unknown. That there is more to Heaven and Earth than worldly knowledge and such.
The greatest unknown was of course, God.
It is human nature to fear the unknown, but somehow God made me feel like I shouldn't fear the unknown, and should instead find out about it.
That's how religion made me feel when I was small.
But nowadays, I see EVERYONE fearing things they don't understand, things that are unknown to them. Religion should give us hope, respect, and such. Since when did people twist it enough to turn it into fear, racism, and hatred?
2007-03-28 18:55:05 · 9 answers · asked by Adia Azrael 4 in Religion & Spirituality