
1. A robot may not harm a human being, or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.

2006-07-05 13:14:21 · 7 answers · asked by Anonymous in Politics & Government > Law & Ethics

7 answers

First, you're talking software and software can always have bugs.
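A minimal sketch of that point (the proximity check and its unit bug are invented for illustration): even a correctly stated safety rule fails if the code enforcing it is buggy.

SAFE_DISTANCE_M = 2.0  # minimum allowed distance to a human, in meters

def safe_to_move(distance_to_human: float) -> bool:
    # BUG: callers measure in centimeters, but the constant is in meters,
    # so 150 cm (1.5 m, dangerously close) passes as 150 >= 2.0.
    return distance_to_human >= SAFE_DISTANCE_M

print(safe_to_move(150))  # True -- the rule is fine, the implementation isn't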

Second, you're assuming that the laws will always be interpreted correctly. Neural-net pattern recognition and rule-based expert systems are fairly adaptable, but neither is anywhere close to perfect yet.

Third, you're assuming that the machine will always be able to recognize what constitutes harm to a person, or who is a person. Humans can't always figure that out, and we've been at this for a lot longer than machines.
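To make that concrete, here's a minimal sketch (all names and numbers are hypothetical): recognizing harm is a classification problem, so the machine acts on an estimate that is wrong some fraction of the time, wherever you set the threshold.

def estimate_harm(action: str) -> float:
    # Stand-in for a learned model returning P(action harms a human).
    # The two drink-serving actions look identical to the model because
    # the poison is outside anything it can observe.
    learned_estimates = {
        "hand guest a drink": 0.01,
        "hand guest a drink (secretly poisoned)": 0.01,  # false negative
        "swing crate near guest": 0.70,
    }
    return learned_estimates.get(action, 0.5)  # unknown action: coin flip

HARM_THRESHOLD = 0.5  # any fixed cutoff trades misses against false alarms

def first_law_permits(action: str) -> bool:
    return estimate_harm(action) < HARM_THRESHOLD

for a in ("hand guest a drink",
          "hand guest a drink (secretly poisoned)",
          "swing crate near guest"):
    print(a, "->", "permitted" if first_law_permits(a) else "refused")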

Especially with the Second Law, a human can almost always come up with a creative way of getting a machine to do something harmful without the machine realizing the consequences, no matter how smart it is. Humans have been finding ways to do the same thing to other humans (see fraud, duress, undue influence, etc.) for thousands of years.
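Here's a minimal sketch of that loophole (everything hypothetical): each order passes a per-order harm screen, but the harmful effect exists only in the sequence, which the screen never examines.

def order_looks_harmful(order: str) -> bool:
    # Per-order screen; only catches orders that are harmful in isolation.
    red_flags = ("strike", "poison", "hurt")
    return any(word in order for word in red_flags)

def execute(orders):
    for order in orders:
        status = "REFUSED" if order_looks_harmful(order) else "executing"
        print(status + ": " + order)

# Each step is innocent on its own; together they poison the guest.
execute([
    "fetch the white powder from shelf 3",  # nothing labels it as poison
    "stir it into the cup on the table",
    "serve the cup to the guest",
])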

And on and on. The three laws are a nice idea, but as someone who spent 10 years in the AI field, I can tell you that abstract concepts like that won't work in practice.

See also James P. Hogan's The Two Faces of Tomorrow.

2006-07-05 13:25:23 · answer #1 · answered by coragryph 7 · 0 0

Well, the thing about Asimov's stories is that the robots never "broke" the three laws. They only exhibited behavior which appeared to break them. When enough information is gathered, it becomes clear that the robot acted within the three laws.

Actually, what I've just said isn't exactly true. There is a fourth law of robotics - a Zeroth Law - that trumps all the others. A robot may break the three laws and even kill a human to obey the Zeroth Law. What is it? Read on, friend. Read on.

2006-07-05 13:33:26 · answer #2 · answered by Loss Leader 5 · 0 0

Order the robot to break the 3 laws.

2006-07-05 13:17:57 · answer #3 · answered by mistresskaida 3 · 0 0

The robots I've seen that were large enough to hurt a human were kept in cages that shut the robot down if the cage was opened.

2006-07-05 17:22:22 · answer #4 · answered by Anonymous · 0 0

Because the military guys don't think some science fiction writer should dictate how they program their equipment.

Seriously, I saw an interview with a military robotics researcher who was asked exactly that question. He laughed for 30 seconds before he answered.

2006-07-05 13:48:00 · answer #5 · answered by JFra472449 6 · 0 0

I'm sure that Isaac Asimov would approve of this. If you're not a robot, you can break the three laws.

2006-07-05 13:26:21 · answer #6 · answered by SPLATT 7 · 0 0

Which one are you reading, The Caves of Steel or The Naked Sun? If you haven't read them, I recommend them highly. Isn't it in The Naked Sun that a robot commits murder? I can't remember how the culprit made him do it now. Darn.

2006-07-05 13:19:45 · answer #7 · answered by Anonymous · 0 0
