John wrote, "I wish above all things that you prosper and be in good health..." He was writing God's words. It's God's will for us to be well.
Only good things come from God. We can't blame Him for everything that happens in the world. Something is not His will just because it happens.
The Bible says "He is not willing that any should perish but that all should be saved." In other words, it is His will that all should be saved. Guess what? That won't happen. Mankind told God to get out of their lives, and He did. Don't blame Him when things go wrong.
answer #1 · answered by jemhasb 7 · 2006-11-28 00:01:29 · 0⤊ 0⤋
God's will sometimes does not include a physical healing. A spiritual healing is completely up to the individual, and God is all for that, of course. He's provided many, many avenues to help us heal spiritually and return to him; it's up to us to take them.
Physical healing sometimes comes and sometimes doesn't. But if it doesn't come, it's not because he doesn't love you; it's because he knows what's best for us in every circumstance, and we just need to trust him that the best can come even from what happens that we cannot prevent.
answer #2 · answered by Anonymous · 2006-11-27 23:55:19 · 0⤊ 0⤋
God came down in the form of a man in order to heal, first and foremost, the damaged relationship between God and man. Every other form of healing is based upon that simple truth.
answer #3 · answered by miky 2 · 2006-11-28 00:08:52 · 1⤊ 0⤋
Healing is God's will. God can make you recover from your sickness if He sees, from beginning to end, that it is good for you. If God sees into the future that your further stay on earth is not good for you, He will let you fall asleep and take you back with Him to heaven. Either way, God is blessing you.
answer #4 · answered by seekfind 6 · 2006-11-28 00:31:27 · 1⤊ 0⤋
Oh yes. It's all over the Bible, and although many don't think healing is for today, they're wrong. By His stripes we were healed; remember, He died to take all that away if we trust and believe Him to do so.
answer #5 · answered by ? 4 · 2006-11-27 23:57:23 · 0⤊ 0⤋
No, it is only nature at work. Healing is just a word for growth and resistance to events that negatively impact the body. Saying that "God willed my white blood cells to destroy the invading bacteria and debris, which resulted in my being healthy" is just wishful thinking.
answer #6 · answered by Anonymous · 2006-11-27 23:58:12 · 0⤊ 1⤋
God wants everyone to live, love, and be happy. He wants us all to reach our full potential. Healing may be God's will, but it won't happen until he is ready to give it to us and we are ready to receive it.
answer #7 · answered by pegasis 5 · 2006-11-27 23:55:52 · 0⤊ 0⤋
I think that you should heal yourself if you can. Sometimes just praying to get better isn't enough. Life is a precious thing, and if you can save yours or improve it, then I would do it. I don't agree with some religions that think you should just let nature take its course and not interfere. If my child could die and needed a blood transfusion, I would surely save her by using someone else's donated blood.
answer #8 · answered by Shay 3 · 2006-11-27 23:57:35 · 0⤊ 1⤋
Yes, of course it is. We were all given the power to go out and heal in Jesus' name; however, there are very few people who actually have that kind of faith. You have to have ultimate faith to be able to do that, and such people are a rare breed these days, though I hate to say it.
But certainly, God wants to heal his people.
answer #9 · answered by Resolution 3 · 2006-11-27 23:54:47 · 1⤊ 1⤋
Everything which happens on earth and in the heavens is God's will.
answer #10 · answered by WC 7 · 2006-11-27 23:55:13 · 1⤊ 0⤋