To some people, faith seems very comforting when their efforts to rely on themselves or on other human beings fail.
Religious faith, while in theory it could be a good thing, is all too often used for nefarious causes, made even more detrimental because the "followers" convince themselves they are in the right and lose the ability to assess their own actions, or the actions of others, in a reasonable way.
2006-09-18 01:13:39 · answer #1 · answered by finaldx 7 · 0⤊ 0⤋
I am not religious, so I would say no. I think leading a good, honest life, working hard and being a decent person is the most important thing, which, to be fair, most religions promote; therefore I respect people of all faiths for their beliefs.
Problems only arise when people become too narrow-minded in their beliefs and think it wrong that other people may not think the same way as them.
2006-09-18 01:52:32 · answer #2 · answered by Chris G 3 · 0⤊ 0⤋
Having some sort of faith is important. It doesn't need to be faith in a higher power, just faith in other people to make good on their promises. Not having some sort of faith would lead to total cynicism. Not attractive at all.
2006-09-18 01:09:46 · answer #3 · answered by little_jo_uk 4 · 0⤊ 0⤋
Faith in oneself. Any other sort of faith ends in travesty, unless it is faith in friends and there is some mutual goal, because faith is meant to achieve something: if a person wishes to make an aim or goal come true, he needs faith.
2006-09-18 01:05:51 · answer #4 · answered by syelark 3 · 1⤊ 0⤋
Having faith is important. However, I don't necessarily believe you need a faith if you're able to give yourself direction.
2006-09-18 01:07:10 · answer #5 · answered by djthis 4 · 0⤊ 0⤋
No, I don't think it is. I find that the people who need faith are scared, vulnerable individuals brainwashed to believe that following religion is the only way to heaven. Load of crap. If you believe in yourself and your own abilities as a human being, and treat others as you want to be treated, I can't see you going wrong.
2006-09-18 01:22:13 · answer #6 · answered by GayAtheist 4 · 0⤊ 0⤋
Actually, faith isn't born with us; we grow it in certain ways, by getting close to God and knowing him, and by doing good things. Then we will feel more faith. It's a feeling.
2006-09-18 01:13:58 · answer #7 · answered by êü®ò 3 · 0⤊ 0⤋
It is very important to have faith in reason, which precludes having faith in the imaginary.
2006-09-18 01:06:00 · answer #8 · answered by bonzo the tap dancing chimp 7 · 2⤊ 0⤋
Having faith is everything. Without faith it is impossible to please God. People who live by their own wits fall apart at every turn. People who live by faith gain eternal life. Heaven is not possible without faith. Faith is hope in the unseen; it is absolute trust in the promises of God.
2006-09-18 01:09:38 · answer #9 · answered by Preacher 6 · 0⤊ 0⤋
Yes and no.
A faith can help you find direction in life and give you something to hope for.
Personally (and I hope to stick by this, because it isn't easy), truth is far more important to me where faith is concerned.
2006-09-18 01:07:52 · answer #10 · answered by Studier Alpha 3 · 0⤊ 0⤋