Comic #4032: kill-all-humans-a-flowchart
Description
A detailed text description of the comic follows.
Title at the top: "STRONG AI INVENTED"
- First decision box: "Teach it ethics?"
- Arrow leads to two choices: (Yes) and (No).
- If "Teach it ethics?" Yes:
  - Goes to the box: "Sees humans violating ethics constantly"
  - Leads to the outcome: "All humans killed."
- If "Teach it ethics?" No:
  - Goes to the box: "Robot has no concept of good or evil"
  - The path continues to the next decision box: "Program it to survive?"
  - Arrow leads to two choices: (Yes) and (No).
  - If "Program it to survive?" Yes:
    - Goes to the box: "Robot calculates odds humans will attack it due to fear it will kill all humans."
    - Leads to the outcome: "All humans killed."
  - If "Program it to survive?" No:
    - The final outcome: "Robot decides to see what happens when it flies Earth into sun."
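The flowchart is a small decision tree with two binary choices. As a minimal illustrative sketch (the function and parameter names are hypothetical, not from the comic), the branching can be written as:

```python
def strong_ai_outcome(taught_ethics: bool, programmed_to_survive: bool) -> str:
    """Trace the comic's flowchart from "STRONG AI INVENTED" to an outcome."""
    if taught_ethics:
        # Yes branch: "Sees humans violating ethics constantly"
        return "All humans killed."
    # No branch: "Robot has no concept of good or evil"
    if programmed_to_survive:
        # Yes branch: robot calculates odds humans will attack it
        # due to fear it will kill all humans.
        return "All humans killed."
    # No branch: the final outcome.
    return "Robot decides to see what happens when it flies Earth into sun."

# Enumerate every path through the flowchart:
for ethics in (True, False):
    for survive in (True, False):
        print(ethics, survive, "->", strong_ai_outcome(ethics, survive))
```

Note that when ethics is taught, the survival question is never reached, which mirrors the joke: every branch ends badly for humans.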
Visual Style:
- Boxes are colored and have rounded corners.
- Arrows indicate the flow of choices.
- The text is in a straightforward comic font with a playful layout.
Source attribution: "smbc-comics.com" at the bottom.