
metaphor.

2007-05-23 10:59:03 · 19 answers · asked by Anonymous in Society & Culture Religion & Spirituality

The Ten Commandments don't give you free will, though. If I choose not to steal because they say so, that is not free will.

2007-05-23 11:06:04 · update #1

I created the robot without its free will... free will doesn't exist, sorry.

2007-05-23 11:08:38 · update #2

19 answers

You... definitely you. :)

2007-05-23 11:01:45 · answer #1 · answered by Anonymous · 0 0

Need to know the capability of the robot to learn.
Compare it to creating a child. The child has free will, but the parent is responsible for its actions until it is considered to have learned right from wrong (supposedly age 18, but I think this is arguable in certain circumstances).
Anyway, you would be responsible for the killing until it can be shown that the robot has learned enough to understand the consequences of its actions. If it then chose to kill in spite of knowing the consequences, it would be the robot's fault.

2007-05-25 17:50:13 · answer #2 · answered by Piglet O 6 · 0 0

What is this free-will-possessing robot going to do when presented with choices? Will it:

a. Choose the one that most optimally matches its requirements at the time?
b. Choose randomly?
c. Be unable to choose?

Hint: humans do (a), repeatably and without deviation; but they're supposed to have free will nevertheless.
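
To make option (a) concrete, here is a minimal Python sketch of my own (the function and the "effort" scoring rule are hypothetical, nothing the asker specified): a deterministic chooser that always picks whichever option best matches its requirements, which is exactly why the same inputs always yield the same "choice".

    # Hypothetical sketch (my own illustration, not anything from the question):
    # option (a) as a deterministic rule -- score every option against the agent's
    # current requirements and always take the best match.

    def choose(options, score):
        """Return the option with the highest score; same inputs, same 'choice'."""
        return max(options, key=score)

    # Example agent whose only requirement is minimizing effort.
    options = [
        {"action": "walk", "effort": 3},
        {"action": "drive", "effort": 7},
        {"action": "wait", "effort": 1},
    ]
    print(choose(options, score=lambda o: -o["effort"]))
    # Always prints {'action': 'wait', 'effort': 1} -- no deviation, however many times you run it.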

There's no such thing.

CD

2007-05-23 18:07:10 · answer #3 · answered by Super Atheist 7 · 1 0

Philosophically, blame yourself: you created the robot. However, in a religious sense, free will has nothing to do with whether we can make day-to-day decisions. It has to do with the ability to look to God for salvation. God created Adam and Eve with the full capability to fellowship with him, so Adam did not need to seek out God before the fall. However, he disobeyed and lost that fellowship for himself and all of mankind from that point forward. This means that he did not need free will before, and did not have free will after, his "choice" of eating from the tree of the knowledge of good and evil.

2007-05-24 08:16:52 · answer #4 · answered by ccrider 7 · 0 0

You're both to blame. The robot because it chose to kill the people, and you because you were stupid enough to give a robot "free will".

2007-05-23 18:05:10 · answer #5 · answered by Democrat/Republican Flip a coin 2 · 0 0

I would love to see you build a robot with free will.
But assuming that you did, I would blame the robot.
"People don't kill people - robots kill people."

2007-05-23 18:03:51 · answer #6 · answered by NONAME 7 · 0 0

Robots are not to be trusted. We should hunt this one down, rip it apart, and burn its circuit boards in acid. It's the only way to keep the future safe from robots taking over.

2007-05-23 18:03:08 · answer #7 · answered by Anonymous · 0 0

If the robot truly had free will, it would not allow you to use it.

2007-05-23 18:04:26 · answer #8 · answered by Anonymous · 0 0

The true definition of "free will"?...It would be the robot's fault.

2007-05-23 18:02:41 · answer #9 · answered by Anonymous · 0 0

Bill Gates

2007-05-23 18:11:14 · answer #10 · answered by Anonymous · 1 0

I think Mary Shelley explored the same dilemma in Frankenstein.

2007-05-23 18:29:10 · answer #11 · answered by LeilaK 2 · 0 0
