Let me ask a different kind of question here. There isn't a doubt in my mind that many Christians, along with people of other faiths, can be hateful, mean, and vicious when it comes to their beliefs and religion, but why is the blame always shifted onto God? All religions teach that He is merciful, all-loving, and created each and every one of us with His love, so why should He be blamed for all the sick extremist banner-carriers out there who claim God hates certain people and saves only those who follow their particular beliefs (à la Jehovah's Witnesses, some Evangelicals, and extreme Christian groups)? They are the ones who make God look bad. God didn't personally come down and dis everybody, so why shift the hate toward Him when it isn't His fault what some HUMANS do?
2007-04-25 20:15:14 · 19 answers · asked by Dusk in Religion & Spirituality