Old 06-20-2013, 04:22 PM   #13
GGG

Quote:
Originally Posted by CaptainCrunch View Post
As an add-on, one of the oldest debates around AI concerns something like Asimov's Laws of Robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Let's say that they finally built the ED-209 from RoboCop, and it came across a hostage situation where a human with a gun was threatening a hostage.


Would the AI be able to rewrite the three rules above to save the hostage from the hostage taker?


Nope, it would either have to retreat from the situation, find a compromise solution, or blow its own head off.
Don't forget that even Asimov's robots developed the Zeroth Law of Robotics, independently of humans, stating that the protection of humanity as a whole outweighs harm to one human. This led one robot to allow the Earth to become radioactive, which forced humans to colonize space.

Not to mention that a lot of Asimov's robot fiction was built around flaws in the Three Laws of Robotics.
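The strict precedence being discussed here can be sketched as a toy rule check. Everything below is an illustrative assumption (the function name, the boolean scenario model, and the Zeroth-Law flag are all made up for this post), not anything from Asimov's stories, but it shows why the hostage scenario deadlocks under the plain Three Laws: shooting harms a human, and doing nothing allows harm, so both options fail the First Law.

```python
# Hypothetical sketch: Asimov's Three Laws (plus an optional Zeroth Law)
# as a strict priority ordering. All names and flags are illustrative.

def permitted(action, *, harms_human, inaction_harms_human,
              disobeys_order=False, endangers_self=False,
              protects_humanity=False, zeroth_law=False):
    """Return True if the action is allowed under the law hierarchy."""
    # Zeroth Law (if enabled): protecting humanity as a whole
    # overrides even the First Law's ban on harming one human.
    if zeroth_law and protects_humanity:
        return True
    # First Law: may not injure a human being...
    if harms_human:
        return False
    # ...or, through inaction, allow a human being to come to harm.
    if inaction_harms_human and action == "do_nothing":
        return False
    # Second Law: must obey orders (unless that conflicts with the First).
    if disobeys_order:
        return False
    # Third Law: self-preservation, lowest priority.
    if endangers_self:
        return False
    return True

# The hostage dilemma: shooting the hostage-taker harms a human (First Law),
# but standing by lets the hostage come to harm (also First Law) -> deadlock.
shoot = permitted("shoot", harms_human=True, inaction_harms_human=True)
wait = permitted("do_nothing", harms_human=False, inaction_harms_human=True)
```

Both calls come back False, which is exactly the "retreat, compromise, or blow its own head off" bind: no available action satisfies the First Law. Flipping on the Zeroth-Law override is the loophole Asimov's later robots exploited.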