If you are a Christian, are your beliefs based on what the Bible says or on what your religious teacher says? Can you locate in the Bible the scriptures that back up what you believe? Or do you just believe certain things about your religion because that's what you've been told is true?

Is it sort of mixed? That is, are there some things you know from what the Bible says and others you just accept without questioning, without knowing the source? Can you really be said to have true faith, or can you really be called a Christian, if you don't know the scriptural backing for your beliefs?

What would you do if you investigated certain teachings at your church and found that the Bible contradicts those teachings? What would you do if you investigated certain teachings and found that some other religion taught the truth from the Bible? Are family ties, history, and tradition more important in choosing a religion than Bible truth?
2007-02-10 17:15:52 · 16 answers · asked by quarky2233 in Religion & Spirituality