A friend of mine asked this a long time ago, and I found the answers very interesting, so I am asking again with the hope of getting answers from different people.
Were you raised in any particular religion? What point in your life did you realize it was all a lie? How did it make you feel?
I was raised LDS, and from the get-go I would ask questions in Sunday school about where dinosaurs came from, who made God, and other similar questions that were never answered to my satisfaction. I was skeptical for my entire childhood. Once I got into college, I started to research my faith and found claims that were completely false and even disturbing. As I researched other religions, I found that they contained vast contradictions and hateful behaviors. The day I realized God did not exist, I felt sick inside. I had clung to the idea that God might be real, and that hope was completely lost. I feel liberated now, and I love seeing things for what they really are.
How about you?
2007-08-27 04:52:00 · 37 answers · asked by Anonymous in Religion & Spirituality