Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain to a level no human mind could handle, and the AI experiences that pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or does it not matter, because it all happened in some weirdo’s basement?

  • MicrowavedTea@infosec.pub · 8 months ago

    I don’t know if that’s where the question comes from, but it’s the exact plot of White Christmas in Black Mirror. I’d say if you build something with the ability to suffer, then its suffering matters. Not sure how you would prove that, though.