Ninety percent of Americans say they believe in God, and the vast majority say God is very important in their lives. Yet so many don't go to church.
I know some might be jaded because of the scandals in the Catholic church, the apparent narrow-mindedness of the extreme religious right, and so on. But what about mainstream churches that teach tolerance and love and welcome others with open arms?
I'm Christian, but I'm not saying others must be Christian. Attend the church, synagogue, or temple of your choice.
Please don't give sarcastic, bitter, nasty replies. I'm looking for real answers. I want to understand it.
2006-12-28 09:45:56 · 14 answers · asked by bluecrabfan in Religion & Spirituality