
Read carefully, as I'm asking which is more important.

2006-09-13 18:36:42 · 25 answers · asked by humm 2 in Society & Culture Religion & Spirituality

25 answers

Me!

2006-09-13 18:39:56 · answer #1 · answered by Anonymous · 0 2

The bible is God's word. We are all ultimately trying to reach God. The bible makes this possible.

And... this is such a loaded question. I have to assume it rests on the assumption that you don't need the Bible because you can have a personal relationship with God without it. While that's true to an extent, how would people have come to any understanding of God without it? There's a reason He wrote it, you know.

2006-09-13 18:43:33 · answer #2 · answered by Shane 3 · 1 0

Well, God is a "documented fact"? Remarkable. That just means you write up some document, like the Bible, and people will accept it as evidence, even proof? Then you'd have to accept vampires, because there are books and films about them. Bigfoot, the Bermuda Triangle, and about a hundred other major bogus beliefs all have some "documentation". Oddly goofy thing about those Christians, really: Jesus taught in one of his sermons (the Sermon on the Mount, I believe) that we should never make oaths, not swear on Jerusalem or even our own heads. So what do they do in the legal system? For almost 2,000 years they used "swearing on the Bible" as the test for credibility vs. perjury. It took mid-twentieth-century thinking to get federal US courts to allow "solemn affirmations under penalty of perjury", dispensing with the Biblical hypocrisy.

2016-11-07 07:08:23 · answer #3 · answered by ? 4 · 0 0

You do not even have to ask this. The Bible came from God. Without the Bible, the only things you would miss are His written teachings and the knowledge of your relationship with Him.
Without God, you wouldn't be here at all.

2006-09-13 18:54:01 · answer #4 · answered by Rallie Florencio C 7 · 1 0

Both! The Bible is "the Word of God." The Bible defines God's character. If you believe in God, you will want to know what He has to say.

2006-09-13 18:41:37 · answer #5 · answered by twelfntwelf3 4 · 0 1

John 1:14 And the Word was made flesh, and dwelt among us, (and we beheld his glory, the glory as of the only begotten of the Father,) full of grace and truth.

The written word was made flesh.
The Bible came alive in the life of Jesus Christ.

The Bible is important because it's God's Word.
Jesus exemplified the Bible's teachings 100%.

2006-09-13 18:53:46 · answer #6 · answered by Bob L 7 · 1 0

Well, if you have to make a choice, I would say God. But the Bible is God's word for His creation; He had men write it from His inspiration, and therefore it is as though God speaks to us through His book.

2006-09-13 19:03:26 · answer #7 · answered by papaofgirlmegan 5 · 0 0

If there were no God, there would not be a Bible, so I guess God is more important (i.e., more necessary). Is there a point to this question?

2006-09-13 18:40:57 · answer #8 · answered by Anonymous · 1 0

God. The Bible belongs to two religions, Judaism and Christianity. If not through the Bible, God finds other ways to communicate with us. "The lovers of God have no religion but God alone."

2006-09-13 18:42:14 · answer #9 · answered by huztuno 3 · 0 1

God! Without God there would be no need for the Bible.

2006-09-13 18:38:45 · answer #10 · answered by BoredomStrikes 3 · 1 0

The Bible is God's word; it's truth. Without God there is no Bible. Who do you think is going to get people into heaven? You can't separate God from His word. The Bible is our guide to show us how to get to God.
Next time ask something a little bit harder. :)

2006-09-13 18:42:41 · answer #11 · answered by Anonymous · 1 0
