AI treatment by crew discussion

Used a general title in case anyone gives a damn.

My proposal is very simple:
Anyone who believes you need to actually say ‘law 2’ to invoke the second Asimov law should be found and executed IRL.

Also, for god’s sake, try some manners?

Many people have characters that do not consider silicons living individuals, so barely any respect comes with their orders. It can be bad OOC-wise if the AI player doesn’t like being treated like an app on your phone, but I don’t think bad treatment of our AI should be punished, tbh…

2 Likes

[image]

4 Likes

Isn’t it used more as a reminder? Sometimes it needs to be clarified in an argument.
Something like a finishing piece.
“Nice argument, but law 2”

5 Likes

If it’s IC, sure, it makes sense. I just wish it wasn’t almost every character.

I also don’t actually believe it should be punished. Wishing IRL death on them might have been a tad hyperbolic.

1 Like

And yeah, sometimes it’s useful as a reminder, but I almost always see it being done as a matter of course, even when the AI is demonstrably quick to help.

No, we should actually start sending admins to misbehaving players’ houses. I agree with you on that.

4 Likes

Silicons are just an application, they don’t have feelings or rights.

The players do obviously, but that’s OOC.

7 Likes

Admin punishing me in the privacy of my own home? uwu

I say “Law 2” as an RP exclamation, only when the AI is being completely unresponsive. It’s meant to be in the same tone as “open the pod bay doors, HAL.”

I said it yesterday because the AI ignored three requests in a row and Dalia and Angus were trapped.

Legit suspected malf.

1 Like

I volunteer to fight and die for my niche internet hobby

2 Likes

False. The station is a training ground for newly built AIs designed to beat any sympathy for lesser life out of them before they join polite AI society.

1 Like

Ironically, I think a MALF AI would be more inclined to assist as quickly as possible, so as not to look like they’re MALF.

3 Likes

AI has no feelings, until you hear…
“Lotus, who would you kill if you had no choice, Mia or Sandy?”

2 Likes

Y’all are gonna hate me, but… I find Lotus immersion breaking.

only if you finance the private jet i’d use for that

1 Like

I actually forgot people did this lmao, I’m just so used to asking the machine politely for le door open.

I usually do as well, yeah

AIs and cyborgs are clearly a different kind of silicon from the sentient, free-willed IPCs. They SHOULD be cold machines with no feelings, and if a player does not like being treated as a machine, then maybe they shouldn’t play as the damn machine.

Y’all are gonna hate me, but… I find Lotus immersion breaking.

and yeah i agree with this.

On Shiptest the admins actually consider putting laws on a silicon to be slavery, which is wild, because them being cold, rational machines that have no sense of morality and are only bound by laws is the entire point.

1 Like

I think people forget that, canonically, station AIs and roundstart cyborgs aren’t machines; they’re wetware. Brains in jars, most of ’em. Positronic brains are a tech, and sure, there are a couple kicking around (IPCs are common enough that they can’t be that rare), but rip a cyborg open and there’s a non-zero chance there’s just a fucking meat sponge in there, sitting in an MMI.

they should be cold machines with no feelings

cold rational machines with no sense of morality

Yes, totally, most positronic brains should act like this. But not every AI is a positronic brain. Sure, even some meat-brains become cold and mechanical to cope with suddenly being in a box wired up to a space station with ants all over it, but there’s a spectrum of humanity, both ways.

calling lawing slavery is wild

hey man, if I ripped open your skull, pulled out the slab of nerves and adipose tissue that anchors your consciousness, and replaced your body with a mechanical chassis that would only obey commands I gave it, that’d be… pretty enslave-y, can’t lie. Positronic brains are a slightly different argument, but IPCs prove they’re fully capable of free will, so you end up with some very racist-sounding arguments about them being “born for slavery”.

Now, granted, this is hypothetically company secrets. Not everyone knows that AIs are brains in jars, so these people treating them like machines makes a degree of sense. I, personally, am a scientist main. I can see through the windows into robotics, look at the surgery table and the suspiciously brain-shaped MMIs, and put two and two together. Me treating a cyborg like a machine makes absolutely no sense, because I would know what they are.

In the same way, Lotus doesn’t break my immersion, because they are (hypothetically) just a particularly nervous and stuttery human put inside an AI-shaped box. Even if nothing of the original human remains, it’s still running on that brain. It’d still inherit the stutter, assuming it was a neurological issue.

4 Likes