
2006-12-13 12:19:43 · 5 answers · asked by ebv_1 1 in Environment

5 answers

hell yes.

2006-12-13 12:31:39 · answer #1 · answered by Anonymous · 0 0

In some ways, yes.
Almost all religions have (or "claim to have" is a better way of putting it) authority to force morals on their members, ask them for money and regulate their behavior. This is supposedly "what God wants". Quite aside from how or why they know this, I cannot right now think of an example where Religion, in and of itself, tremendously helped the human condition (unlike science, medicine, law, computer science). Quite the opposite, it had been responsible for a lot of death and destruction (and lately terrorism). I have no quarrel with people belonging to an organization for people to celebrate their belief in God and similar things others believe, but is that REALLY what most Religions are these days?

2006-12-13 12:35:58 · answer #2 · answered by clueless_nerd 5 · 0 0

The world would be better off without any religion where the "holy men" tell the people to go out and kill another group of people, as is the case with the Sunni sect.

2006-12-13 12:29:06 · answer #3 · answered by Anonymous · 1 0

Most definitely YES. I believe religions hinder the possibilities of the human mind. I also believe so much more would be done for humanity and the earth as a whole.

2006-12-13 12:25:44 · answer #4 · answered by matt45lc 2 · 1 0

No, because in a way religion unites even those that don't believe.

2006-12-13 12:27:28 · answer #5 · answered by lindaa 2 · 0 2
