I believe that death is the beginning of a greater form of life, but who really knows what goes on after you die except those who have died? There are many different beliefs about heaven, hell, reincarnation, and other things after death, but who says we'll be living on in another form after we die? Maybe when we die, we just die. Our very existence might simply be wiped from the universe, ceasing to ever exist. We'd just become nothing and vanish forever. What do you think about death? Do you think it's the end of life?
2006-07-14 16:41:40 · 34 answers · asked by Theresa T in Religion & Spirituality