
What's the one thing most major religions have in common? The Bible. Each religion has applied it differently, but they all use it. One of the main messages is to have Faith. Could it be that what is important is just that we have Faith, and the rest of it will just fall into place?

2006-11-17 07:31:23 · 17 answers · asked by sweetie_baby 6 in Society & Culture Religion & Spirituality

I'm not confused. I said MAJOR religions. Judaism: the Torah, the Old Testament of the Bible. Catholics: the Catholic Bible, which is the Bible with additional books. Mormons: the Bible, with an additional book. Baptists: the Bible. MAJOR religions. Last I checked, those were the major religions followed.

2006-11-17 07:41:06 · update #1

17 answers

No, faith is not enough. James 2:17 says "thus, too, faith, if it does not have works, is dead in itself."

2006-11-17 07:48:52 · answer #1 · answered by pachequito 2 · 2 0

Islam is a major religion that doesn't use the Bible; it uses the Koran. Some religions claim to use only the Bible but actually rely more on other writings that are uninspired and doctrinally contradict the Bible (e.g., Mormons use the Book of Mormon). Christians are the only ones who stick with the entire Bible alone, that which is known to be inspired by God Himself. Other major religions (and true Christianity isn't a religion) either use an entirely different book and completely omit the Bible, or they claim to believe the Bible but rely more on their uninspired writings, or they don't use the entire Bible, Old and New Testament. Some religions teach their members that they must have another book in order to understand and interpret what the Bible says. True Christians use only the inspired Word of God as recorded in the Bible, minus the obviously uninspired Apocrypha, because it clearly contains contradictions to the teachings of the rest of Scripture.

2006-11-17 07:45:01 · answer #2 · answered by utuseclocal483 5 · 0 1

You really need to think more closely about that. Firstly, most major religions (about 54% of the world's population) do use the Bible in some way, but the other 46% don't! There's a lot of religion in this world, and your Bible is losing, not gaining, influence.

Secondly, the Bible itself does not teach tolerance as you suggest. Religious intolerance is a staple of the Abrahamic religions: Jew vs. Gentile, Christian vs. Jew, Muslim vs. Christian, etc.

Faith is important, but faith without accurate knowledge and righteous works is useless, or so the Bible teaches. It also teaches that destruction is the end of all who oppose it.

I support your philosophy of faith and tolerance, but your religion does not. Do not merely be tolerant of others, then, but apply yourself zealously to eliminating intolerance wherever it may be found, even in your own belief system, holy books, and churches.

You seem peaceful and enlightened. You have a responsibility.

2006-11-17 07:46:01 · answer #3 · answered by B SIDE 6 · 0 1

Actually the Bible is not the thing that most major religions have in common. The Bible is the thing that most Christian sects have in common but some have their own versions of it.

2006-11-17 07:34:06 · answer #4 · answered by Anonymous · 4 1

Yeah, you're confused. The three Abrahamic religions are religions of the Book (meaning they apply Mosaic law, albeit interpreted differently). But Hinduism doesn't use the Bible, Buddhism doesn't, Jainism doesn't, neither does Zoroastrianism or Wicca or Satanism or Confucianism or Shinto or Sikhism or Taoism or Vodun or any of the Native American faiths...

2006-11-17 07:37:06 · answer #5 · answered by N 6 · 4 1

For Christianity, belief and faith alone are not enough. The New Testament even says that the demons have faith that there is a God. Jesus Christ doesn't want just faith; he wants all of you! Christianity is not a religion, it is a relationship. That is what makes it so unique in contrast to other religions of the world.

2006-11-17 07:35:29 · answer #6 · answered by j_2_the_friggin_l 1 · 1 1

The important thing is your personal relationship with Jesus.

People often misplace their faith by putting it in something or someone other than Jesus.

Hebrews 11:1 defines "faith" but you must read the entire Bible to understand how to use it. And if you have any questions, talk to the author Himself.

2006-11-17 07:43:09 · answer #7 · answered by Anonymous · 1 1

I always wondered: why do Christians proclaim theirs as the only truth, that it is THE truth and nothing but the TRUTH, when their whole system is based on faith? Why would you need faith if it was the truth? Why can't Christians come to terms with this and realize that they are using faith, not fact, in their religion? I don't have faith that 2 + 2 is 4. That is a fact.

2006-11-17 07:35:42 · answer #8 · answered by Anonymous · 2 0

That would be a blind faith, which many do have. If Satan is the great deceiver, who would he most likely be able to deceive? True faith is faith in truth. It is knowing the truth about Jesus and His work of salvation, and knowing that it is for me, not just for someone else. True faith is knowing Jesus yourself and knowing that He died for you and that you are righteous before God in Him. You can't earn it or add to it. Most "faiths" that differ, you will find, are loosely based on the Bible and mostly on the whims of men.

2006-11-17 07:38:21 · answer #9 · answered by beek 7 · 2 1

Now faith is the substance of things hoped for, the evidence of things not seen. Hebrews 11:1, King James Version. That should pretty well answer your question.

2006-11-17 07:45:48 · answer #10 · answered by john h 3 · 1 1

It's only the Christian denominations that have the Bible in common.
Many, many more people in the world don't adhere to the teachings of the Bible at all.

2006-11-17 07:36:27 · answer #11 · answered by DontPanic 7 · 2 1
