If we made robots that were human-like and uploaded all known data, how would they decide their morals? Every movie I see has Directive 1, 2, 3... so why do the Ten Commandments seem so strange to some people? Is it not logical that a created being is subject to directives?

2006-12-28 11:56:13 · 8 answers · asked by Pilgrim 4 in Society & Culture Religion & Spirituality

8 answers

A robot's moral capacity would be limited by its ability to deviate from its programming. A robot can commit an 'immoral' act, but only from an outsider's perspective, because it is limited by the commands and input/data it has received.

Just as children are exempt from being 'sinful' until they reach the age of reason/recognition, so too would a robot (because it does not evolve) be exempt from being considered immoral.

Even adaptability programs would be written according to rules, negating the free will that humans enjoy. And if a human creator did create a 'sandbox/free will' program, the creator would still be held liable, because they could have programmed the robot NOT to allow such an act to be committed.
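The point about adaptability programs can be sketched in a few lines: even a robot that "learns" only adjusts itself inside limits its creator wrote down, so the creator's rules still govern every outcome. The class, parameter name, and bounds below are invented purely for illustration.

```python
# Sketch: a "learning" robot whose adaptation is still bounded by coded rules.
# The parameter name and limits are hypothetical, not from any real system.

class AdaptiveRobot:
    def __init__(self):
        self.caution = 0.5             # learned parameter, adjusted by feedback
        self.MIN, self.MAX = 0.1, 0.9  # hard limits fixed by the programmer

    def learn(self, feedback: float) -> None:
        """Adjust behaviour from experience, but never beyond the coded limits."""
        self.caution = min(self.MAX, max(self.MIN, self.caution + feedback))

robot = AdaptiveRobot()
robot.learn(10.0)     # however extreme the experience...
print(robot.caution)  # ...the result stays clamped at the programmer's 0.9
```

However large the feedback, the robot can never act outside the range its creator chose, which is the sense in which the rules "negate" its free will.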

2006-12-28 12:05:56 · answer #1 · answered by Khnopff71 7 · 1 0

The morals would be in the code: certain decisions would be made based on certain criteria being met or not met, and those decisions would reflect the programmer's morals. For example, the robot could be coded to kill gays or to disregard homosexuality entirely.

For a robot to have these so-called morals and function somewhat like a human, it would need a high level of logic. Robots get their logic from their programming code, so this would be a very, very advanced robot. I don't think such robots exist yet, but they will. Maybe I will even help build one. Robotics is fascinating.
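The "morals in the code" idea above can be sketched as a lookup table: the robot's "moral" verdicts are just whatever rules its programmer chose to write down. The rule names and the default here are hypothetical illustrations, not any real robot's rule set.

```python
# Sketch: a robot's "morals" as programmer-chosen decision criteria.
# Rule names and the forbid-by-default policy are invented for illustration.

PROGRAMMED_RULES = {
    "harm_human": False,  # forbidden by the programmer
    "tell_truth": True,   # required by the programmer
    "steal": False,
}

def is_permitted(action: str) -> bool:
    """Return the programmer's verdict; unlisted actions default to forbidden."""
    return PROGRAMMED_RULES.get(action, False)

print(is_permitted("tell_truth"))  # True
print(is_permitted("harm_human"))  # False
```

Whatever the robot "decides", the verdict was already fixed when the table was written, which is exactly why those decisions are the programmer's morals, not the robot's.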

2006-12-28 20:02:46 · answer #2 · answered by ÜFÖ 5 · 1 0

Data would be meaningless without instruction and logic (GIGO: garbage in, garbage out). If one could load in a mutually acceptable set of instructions, then you could program moral capacity. The problem would be deciding what is acceptable and, more importantly, why.

2006-12-28 20:00:41 · answer #3 · answered by Scott K 7 · 0 0

They could act morally - but not have a moral capacity.
That requires a spirit - a heart.
I agree with you that, as the created beings we are, we need direction and a sense of right and wrong...

2006-12-28 20:00:51 · answer #4 · answered by Anonymous · 0 0

Yes, robots can have moral capacity. Just look at all the "moral" people out there who follow man's rules instead of their conscience when making moral decisions... sounds robotic to me.

2006-12-28 19:58:47 · answer #5 · answered by Anonymous · 1 1

Yes, it sure can!!
Man has only to program it into the robot:
set all the moral values into it,
tell it to be faithful to one woman,
not to kill, steal, lie, or cheat,
etc., etc.

2006-12-28 20:13:25 · answer #6 · answered by ausblue 7 · 0 0

The answer is no, because to be able to have morals you have to be alive. Man cannot make life.

2006-12-28 20:05:10 · answer #7 · answered by saintrose 6 · 0 0

The brain of a fly is still far ahead of any robot...

2006-12-28 20:00:54 · answer #8 · answered by Harvard 4 · 1 0
