As you may know, Asimov Law 1 is “You may not harm a human, nor through inaction allow a human to come to harm.”
In a recent game, I was faced with a decision that ultimately got me banned from AI for a few days. While this is not an appeal of that admin’s decision, I want to know what people’s opinions are on it, as it’s a profoundly interesting dilemma to discuss.
It’s LRP, and you are the AI with the Asimov lawset. A xeno egg has spawned in xenobio. Naturally, the scientists breed up a xeno queen and the xenos escape. There are 14 connected players, most of whom do not have suit sensors on, and many others are SSD. The xenos have captured several humans and bred more. The crew are incapable of fighting them off.
You have an idea.
You start prepping atmos with an oxy-plasma mix and begin building pressure in the mix chamber. You inform the crew that made it to departures that you plan to burn out the aliens with a controlled plasma fire once they are safely on the shuttle. You start plasma fires in departures and in the main hall.
The xeno queen comes out near departures and you trap her in such a way that she burns to death, allowing the shuttle to leave. Four people managed to escape; however, unknown to you, two people died to the plasma fire. Human harm. Banned from AI for a week.
What I find interesting about this scenario is the paradox it causes with Law 1. You cannot harm a human, BUT you ALSO cannot let a human come to harm by doing nothing. If the only thing left you CAN do to save humans is flood plasma, aren’t you, by Law 1, also required to do it?
Alternate scenario: the crew in departures get onto the shuttle and you don’t flood plasma. You lock the doors in departures and wait for the shuttle to leave.
Hostile environment: the shuttle will not leave until the queen forces it to or she dies.
It’s only a matter of time before she gets onto the shuttle, and she’s space-proof, so you can’t suffocate her by siphoning. She gets on, she kills the rest of the crew, the shuttle leaves, hijack successful.
While it’s unlikely that this is going to get the AI a ban, it should. You took no action to prevent harm. Nothing you did would stop the xenos from killing the rest of the crew. You failed to prevent human harm even though you could have.
The problem is that both the “do not harm humans” part of the law and the “do not, through inaction, allow humans to come to harm” part are of equal priority. If you do not flood to save the humans who are about to escape, you’ve broken the law. If you plasmaflood and it kills anyone who isn’t a xeno (most likely due to their own incompetence or being SSD in the wrong place), you’ve broken the law.
What are your thoughts on this paradox? What should be done to try to fix it? Was I right to plasmaflood to save the many at the cost of the unknown few? What would you have done instead?
- Plasmaflood
- Let crew die
- Other?
- I agree this is bullshit