Ian Bogost, professor in the School of Literature, Media, and Communication (LMC) at the Georgia Institute of Technology, wrote the March 30 article “Enough With the Trolley Problem” in The Atlantic. LMC is part of Georgia Tech’s Ivan Allen College of Liberal Arts.
Excerpt:
Should an autonomous car endanger a driver over a pedestrian? What about an elderly person over a child? If the car can access information about nearby drivers it might collide with, should it use that data to make a decision?

The trolley problem has become so popular in autonomous-vehicle circles, in fact, that MIT engineers have built a crowdsourced version of it, called Moral Machine, which purports to catalog human opinion on how future robotic apparatuses should respond in various conditions.

But there’s a problem with the trolley problem. It does a remarkably bad job addressing the moral conditions of robot cars, boats, or workers, the domains to which it is most popularly applied today. Deploying it for those ends, especially as a source of answers or guidance for engineering or policy, leads to incomplete and dangerous conclusions about the ethics of machines.

Written by Ian Bogost, contributing editor at The Atlantic. He is the Ivan Allen College Distinguished Chair in Media Studies and a professor of interactive computing at the Georgia Institute of Technology.
For the full article, visit The Atlantic website.