Standard Nguyen's Github Projects

Comic #2871: 2013-03-28


Description

Transcription:


Person A: Suppose you're inside an out-of-control train. It's hurtling toward five people. You can do nothing, or you can cause it to change tracks and hit one person. What's the right thing to do?

Person B: I would remove the part of my brain that governs empathy, which is the source of ethics.


Person B: The remainder of me is an inhuman computing machine, and therefore its behavior has no moral aspect, any more than a computer determining a sum has a moral aspect.

Person A: The inhuman computing machine makes a choice, which causes some number of deaths.


Person B: If a person had made the choice, either option would have been immoral. Since the computing machine chose, it was merely amoral. Since the latter is preferable, I made the most rational choice.


Person A: Wasn't it unethical when you removed the empathy part of your brain?

Person B: No, because it didn't alter the final outcome's level of morality.


Person A: Neuroethics is kind of disturbing, isn't it?

Person B: I wouldn't know. I removed the neuroethical part of my brain.
