
If we made robots that were human-like and uploaded all known data, how would they decide their morals? Every movie I see has Directive 1, 2, 3... so why do the Ten Commandments seem so strange to some people? Is it not logical that a created being is subject to directives?

2006-12-28 17:35:06 · 20 answers · asked by Pilgrim 4 in Society & Culture Religion & Spirituality

I asked this question again because there were not many answers online before. As a software engineer, this question intrigues me, because giving a robot learning ability and free will would be dangerous, yet God did that with us.

2006-12-28 17:50:46 · update #1

Mark T
You are getting at my point...
"computer just decides that humans are irrelevant": aren't people doing that to God?

2006-12-28 18:10:11 · update #2

20 answers

Let's put it this way:

While the commandments have served some of mankind well, they have also been the basis for many wars and misunderstandings.

(For instance, exactly how many commandments are there? Six, seven, ten, or more? It all depends on who you talk to: the code of Adam, the code of Noah, or the Decalogue.)

However, some basic set of rules will likely be applied.

Perhaps we program the machines with rules whereby they cannot do things we don't want them to do. This was best described by Isaac Asimov in his Three Laws of Robotics.

However, if we don't do that, we could end up in a situation like Colossus, where the machine simply acts logically, working for the "betterment of all mankind", but the cost is that we give up our freedom as a race/species.

The computer makes a very interesting speech at one point during the movie.

Worse would be a situation like The Matrix or Terminator, where the computer simply decides that humans are irrelevant, unnecessary, and/or dangerous to itself, and moves to eliminate us.

2006-12-28 18:04:22 · answer #1 · answered by Mark T 7 · 1 0
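The Three Laws the answer above mentions can be sketched as a priority-ordered rule check. This is only a toy illustration: the `Action` fields and the `evaluate` function are assumptions invented for this example, not part of any real robotics framework.

```python
# Toy sketch of Asimov's Three Laws as a priority-ordered rule check.
# All names here are illustrative assumptions, not a real API.
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_human: bool = False           # First Law concern
    disobeys_human_order: bool = False  # Second Law concern
    harms_robot: bool = False           # Third Law concern

def evaluate(action: Action) -> bool:
    """Return True if the action is permitted under the Three Laws."""
    if action.harms_human:              # Law 1 outranks everything
        return False
    if action.disobeys_human_order:     # Law 2: obey orders (unless Law 1 applies)
        return False
    if action.harms_robot:              # Law 3: self-preservation, lowest priority
        return False
    return True

print(evaluate(Action("fetch coffee")))                      # True
print(evaluate(Action("push a bystander", harms_human=True)))  # False
```

The ordering of the `if` checks is the whole point: a lower law never overrides a higher one, which is exactly the hierarchy Asimov's stories explore by finding edge cases where the checks conflict.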

1) Yes, of course you can program a robot to have moral capacity.
This already exists; for instance, browsers that will not let you go to sites with dirty words/pics on them.

2) If you only upload data into a computer, it will never do anything useful with it. It is the program that does the things.

3) I don't follow you here. What is the relation between movies with directives in them and why people think the commandments are strange?
If people see some commercial on TV, they usually find it strange as well.

4) It depends on what you call a created being.
A flower is certainly not subject to moral directives, but it is subject to directives; for instance, it will only grow when there is water.

It is better to pose ONE question at a time.
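The content-filtering browser in point 1 above is essentially a keyword blocklist: a programmed rule about what the machine may not do. A minimal sketch, where the word list and function name are illustrative assumptions:

```python
# Minimal sketch of a keyword blocklist, the kind of "programmed moral
# rule" a content-filtering browser applies. The blocked words and the
# function name are illustrative assumptions.
BLOCKED_WORDS = {"gore", "violence"}

def is_allowed(page_text: str) -> bool:
    """Permit the page only if it shares no words with the blocklist."""
    words = set(page_text.lower().split())
    return BLOCKED_WORDS.isdisjoint(words)

print(is_allowed("cute kittens playing"))    # True
print(is_allowed("graphic violence ahead"))  # False
```

Note this also illustrates point 2 above: the data (the word list) does nothing by itself; it is the program consulting it that produces the "moral" behavior.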

2006-12-28 17:42:06 · answer #2 · answered by gjmb1960 7 · 0 0

I agree it is logical that everyone is subject to directives in many capacities. It is up to the human being, however, which directives to follow, whereas the robot is unable to make that choice. Robots could possibly have moral capacity, but I seriously doubt it would matter if they don't have a soul and the freedom of choice.

2006-12-28 17:47:32 · answer #3 · answered by braleygirl 3 · 0 0

Um, have you watched Terminator lately? Not all sci-fi robots follow Asimov's Three Laws of Robotics (four if you include the Zeroth Law: a robot may not harm humanity as a whole, or, through inaction, allow humanity as a whole to come to harm).

2006-12-28 17:43:49 · answer #4 · answered by Anonymous · 0 0

A robot with human capacity means that it's capable of learning and having emotions. They'd have morals because THEY have to live in this world too.

Isn't the world a nicer place to live in when everyone isn't trying to hurt each other all the time? Yes, I thought so too.

So, do you have the brains of a pea where you have to have someone tell you that? Or are you intelligent enough to learn it on your own?

2006-12-28 17:40:26 · answer #5 · answered by Anonymous · 0 0

I answered that question before, and yes, man could program moral values into the robots, such as telling it to be loyal to one love,
not to murder
or steal,
etc.,
but it would never feel like a human does, because it has no heart
or brain.

2006-12-28 17:44:42 · answer #6 · answered by ausblue 7 · 0 0

God created us in his own image; we ate from God's tree of knowledge. It is conceivable that eventually we as mankind could reach a level where we could create a new life form. The 1, 2, 3 thing is to keep them under control. As for the Ten Commandments, they are a test from God; no one but Jesus has ever followed all of them, so nothing that man created could either.

2006-12-28 17:39:51 · answer #7 · answered by Anonymous · 0 0

Robots could probably have moral capacity. The problem with your thinking is that you think morality comes from religion, when in fact it comes from biology.

In software terms, morality is in the firmware, not the software.

2006-12-28 17:36:53 · answer #8 · answered by STFU Dude 6 · 0 0

Hmmmm perhaps ya like Data from Star Trek lol

2006-12-28 17:37:35 · answer #9 · answered by ? 4 · 0 0

If science starts on the morality factor, then one day a virus will attack a robot's central processing unit, and very soon we'll be ruled by robots.

2006-12-28 19:04:49 · answer #10 · answered by sultan 4 · 0 0
