Standard Nguyen's Github Projects

Comic #4032: kill-all-humans-a-flowchart

View Original Comic

Description

Here is a detailed text description of the comic:

Title at the top: "STRONG AI INVENTED"

  1. First box: "SEES HUMANS VIOLATING ETHICS CONSTANTLY"
    • An arrow leads to the decision "Teach it ethics?", which branches into Yes and No ("Robot has no concept of good or evil").
  2. If "Teach it ethics?" is answered Yes:
    • The next box reads: "Robot calculates odds humans will attack it due to fear it will kill all humans."
    • This leads to: "All humans killed."
  3. If "Teach it ethics?" is answered No:
    • The path continues to a second decision: "Program it to survive?", again with Yes and No branches.
  4. If "Program it to survive?" is answered No:
    • The final outcome: "Robot decides to see what happens when it flies Earth into sun." (The branches are also sketched as code below.)

Visual Style:

Source attribution: "smbc-comics.com" at the bottom.