
I would like to know your views on why religion must play such a large role in everything that we do. Why should rules that exist only for a certain group of people be applied to everyday decisions?

And most importantly, why must Christians always cite "God" as the reason for everything, when those who think differently choose not to because it might offend people with different beliefs?
(This is a legitimate question, not meant to offend or brew hatred.
Just a few questions I know many would like to have answered.)

2007-01-24 15:44:18 · 17 answers · asked by Hex 5 in Society & Culture Religion & Spirituality

To those who see themselves liberated from the structures of religion, my hat is off to you.

The truth is, religion does play a big part in our everyday lives. Christians are told to "purge the evil," and some who understand it that way, much as followers of other religions do, take it upon themselves to empower their actions with crimes of hate, offending and hurting those who won't comply with their criteria.

To those who stand with their religion, and to those who are against any and all religion, consider me your middle ground. And keep in mind that the "God" Christians speak of is of the same importance as anyone else's, and that our nation as a whole has come to be, in part, by the favor of the Christian community. I believe in equality and freedom, the two main standards agreed upon when the US was first established.

Hopefully you will all agree when I say that world peace cannot come from one power above all others, but from all others as one power.

2007-01-24 16:30:51 · update #1

17 answers

As a doctor, I have to deal with religion every day and watch people wrestle with doubt when their god doesn't save them. It seems like a cruel joke to give people such false hope, only for it to abandon them in the end.

2007-01-24 15:51:26 · answer #1 · answered by Dr. Brooke 6 · 4 1

Void, IMO, religion has established societal mores for centuries because the basic tenets of faith have proved to be in the best interest of the human experience. There are a lot of crackpots out there, and some who really just don't get it, but I think it is hard to deny the positive impact religious values have had on society historically. This statement is not a broad-brush endorsement of fundamentalist "Christianity," and it doesn't imply that Christians themselves can do no wrong, but Christian values have played an important role in the prosperity of the world.

2007-01-24 15:57:39 · answer #2 · answered by rndyh77 6 · 1 0

Religion doesn't "have" to play such a large role in everything; that's a generalization. Just because we have some God talk on our currency, and people swear on a Bible or another holy book, doesn't mean religion is playing a major part. Religion is a cultural construct; people have always looked to religion to find meaning. I'm not sure which rules of religion you are thinking of, but basic rules concerning how we treat one another come from places besides religion.

2007-01-24 15:58:52 · answer #3 · answered by keri gee 6 · 1 0

Unfortunately, "religion" and "faith" are so embedded in some people's lives that there is no separating their beliefs from daily events. Some may even jeopardize their well-being because of it.

Personally, I think you generate your own destiny, with a little luck. Other views don't offend me, and if mine offend others, well, I can't help it. Just don't try to influence me with a god, and I won't tell you that you can't have yours.

2007-01-24 16:07:09 · answer #4 · answered by Troubled Troll 4 · 0 0

In my opinion, most Christian rules are just rules of morality and should be followed. I really don't think many of the Christian rules discriminate against anyone, but I'm a teenager; we never know what we're talking about.
I know that as a Christian I'm never trying to offend anyone; I'm just trying to love God and improve my relationship with Him. I'm sorry if what some Christians do offends you; not all are like that.

2007-01-24 15:51:32 · answer #5 · answered by mrfame1017 3 · 0 2

Because they are an overwhelming majority.

When Islam eventually takes over and claims that the American Indians were Muslims who came here in 700 AD, then you'll have to wear a dress and put a skull cap on, and like it or die.

2007-01-24 16:55:32 · answer #6 · answered by Anonymous · 0 0

Because this world is enemy territory, and we have an obligation to sound the alarm, and that's offensive to those who don't want to repent. Those who have a heart for repentance are not offended by being told that God so loved the world that He gave His only begotten Son, that whosoever believeth in Him should not perish, but have everlasting life.

2007-01-24 16:08:27 · answer #7 · answered by hisgloryisgreat 6 · 1 0

All religion needs to be exposed as nonsense and shown to be purely the invention of man. The world might be more peaceful without all these false beliefs that pit one group against another.

2007-01-24 16:41:22 · answer #8 · answered by Anonymous · 1 0

It is not about "religion"; it is about relationship. If you don't believe in God, I may be very concerned about your eternal soul and its ultimate destiny, but I can still have a healthy, peace-loving relationship with you. As a Christian, I apply my relationship with Christ to my everyday decisions because I love Him and want to make choices that please Him, just as anyone married would with their spouse.

2007-01-30 16:46:12 · answer #9 · answered by wd 5 · 0 0

It shouldn't. However, it is regularly used to hide people's actual motivations and ambitions. It is a major cause of wars and fighting, and it allows people to blame someone else for their problems.

2007-01-30 20:25:20 · answer #10 · answered by Anonymous · 0 0
