About This Blog

Automating Invention is Robert Plotkin's blog on the impact of computer-automated inventing on the future of invention and patent law.


June 23, 2005

Are programmers responsible for harm in virtual worlds?

The videogame 25 to Life is getting slammed by Senator Charles Schumer (D-NY) for allowing players to take the role of criminal gang members and kill police officers. The game, published by Eidos, lets players play scenarios as either police or criminals.

This reminds me in some ways of the speculation after September 11, 2001 that terrorists had used Microsoft Flight Simulator to practice flying planes into the World Trade Center, and Microsoft's subsequent decision to remove images of the World Trade Center from the software. Both 25 to Life and Flight Simulator allow their players to perform actions in virtual worlds, including actions that would be clearly immoral and unlawful if committed in the real world.

One important difference between the two, however, is their generality. Flight Simulator is a general-purpose flight simulator, allowing the user a wide range of choice about whether to fly peacefully over the Atlantic Ocean or to crash into skyscrapers. 25 to Life provides the user with a much narrower range of choices. Based on the descriptions I've seen, the user who plays the role of criminal has to commit crimes to win the game.

I think this distinction has ethical implications for computer programmers. We don't typically consider the inventor of a device that has a wide variety of beneficial and harmful uses (such as a hammer) to be morally responsible for someone else's use of that device to harm someone. But we at least take seriously the ethical responsibility of the inventors of devices (such as weapons of mass destruction) whose primary purpose is to hurt people, even if the inventors themselves never use those devices in harmful ways. Even inventors of "general purpose" inventions have questioned their own responsibility for the harm caused by their inventions, as in the case of Alfred Nobel, who established the Nobel Peace Prize after seeing his invention -- dynamite -- being used to make war.

Artificial creativity software adds an additional layer of indirection between the inventor/programmer and the resulting (real or virtual) harm. If Eidos had released a general-purpose simulator in which players could live any life of their choice, would (or should) Eidos have received any flak if its customers decided to play as characters who killed police officers? Would Alfred Nobel have had a clearer conscience if he had written artificial creativity software that he -- or perhaps someone else -- then used to produce the design for dynamite?

I am curious to hear feedback about this, in particular about how views of the ethical responsibilities of inventors have varied over time and from culture to culture.

Posted by Robert at June 23, 2005 5:27 PM
category: Ethics
