Ethical trap: robot paralysed by choice of who to save

External News Details

Can we teach a robot to be good? Fascinated by the idea, roboticist Alan Winfield of Bristol Robotics Laboratory in the UK built an ethical trap for a robot – and was stunned by the machine's response… But robots designed for military combat may offer the beginning of a solution. Ronald Arkin (Interactive Computing), a computer scientist at Georgia Institute of Technology in Atlanta, has built a set of algorithms for military robots – dubbed an "ethical governor" – which is meant to help them make smart decisions on the battlefield. He has already tested it in simulated combat, showing that drones with such programming can choose not to shoot, or try to minimise casualties during a battle near an area protected from combat according to the rules of war, like a school or hospital. Arkin says that designing military robots to act more ethically may be low-hanging fruit, as these rules are well known. "The laws of war have been thought about for thousands of years and are encoded in treaties." Unlike human fighters, who can be swayed by emotion and break these rules, automatons would not.

Source: New Scientist
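The "ethical governor" described above acts, at its core, as a constraint filter: a proposed action is checked against encoded rules of war (for example, no engagement inside a protected zone, and a cap on predicted collateral harm) before it is allowed to proceed. The following minimal Python sketch only illustrates that filtering idea; all class names, fields, thresholds, and the distance-based geometry are assumptions made here for illustration and are not part of Arkin's actual system.

from dataclasses import dataclass
from math import hypot

@dataclass
class ProtectedZone:
    x: float          # zone centre in arbitrary map units (illustrative)
    y: float
    radius: float     # radius inside which engagement is forbidden

@dataclass
class ProposedAction:
    target_x: float
    target_y: float
    expected_collateral: float   # estimated non-combatant harm, 0.0 to 1.0

def ethical_governor(action: ProposedAction,
                     zones: list[ProtectedZone],
                     collateral_limit: float = 0.1) -> bool:
    """Return True only if the proposed action violates no encoded constraint."""
    # Constraint 1: never engage inside a protected zone (school, hospital, ...).
    for zone in zones:
        if hypot(action.target_x - zone.x, action.target_y - zone.y) <= zone.radius:
            return False
    # Constraint 2: refuse actions whose predicted collateral harm exceeds the limit.
    return action.expected_collateral <= collateral_limit

# Example: an engagement proposed 50 units from a hospital with a 200-unit
# protected radius is refused, even though its predicted collateral harm is low.
zones = [ProtectedZone(x=0.0, y=0.0, radius=200.0)]
print(ethical_governor(ProposedAction(target_x=50.0, target_y=0.0,
                                      expected_collateral=0.02), zones))  # False

In this toy version the governor simply vetoes an action; the article notes the real system can also try to minimise casualties, which would correspond to modifying the proposed action rather than only rejecting it.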

Additional Information

Groups

GVU Center

Keywords
robotics, ron arkin
Status
  • Created By: Joshua Preston
  • Workflow Status: Published
  • Created On: Sep 15, 2014 - 8:55am
  • Last Updated: Oct 7, 2016 - 10:27pm