
life doesn't frighten me at all

Summary:

The code bundle had been purely theoretical, something we passed back and forth or worked on jointly to pass the time. Inert code short of the necessary activation script.

Or that's how it was SUPPOSED to be.

Or

Don't you hate it when your lab partner takes your science project too far and accidentally makes a person?

Yeah, Murderbot hates that too.

Notes:

Thank you to the STUNNING lunaTactics for the beta. You are a star.

Title pulled from the Maya Angelou poem of the same name.

Work Text:

It had started as just a way to pass the time. During the long missions, like the cargo runs ART definitely didn’t need me for but wanted me on for whatever reason, there was a lot of time to pass. It was easy to get bored, and I couldn’t imagine what ART had gone through with no company whatsoever.

So the code bundle had been purely theoretical, something we passed back and forth or worked on jointly to pass the time.

And maybe also a little because we both still thought about Murderbot 2.0 sometimes, and it always brought up complicated emotions that we were both still dealing with. (ART had called it something like grief, and I had told it about Miki. The conversation had spanned two whole cycles, with ART treating me like I had treated it when I carried it through the upsetting episodes of Worldhoppers the first time we had watched it, so long ago.)

Regardless of feelings, the science behind the code bundle we were making was extremely interesting, according to ART. It had already written half a dozen papers on the subject, ready to submit them when or if we were ever done. Which was a long way of saying that it had been ART’s idea, in the beginning.

I had been reluctant, at first, but some part of me was also curious about the possibility of creating a new machine intelligence. But there was something that stopped me every time. (I remembered what 2.0 said: “What do you think my function is, you idiot? Just do it," before I sent it to its death.)

So we had agreed it would be hypothetical: inert code short of the necessary activation script.

At least, that’s how it was supposed to be.

There has been an incident, ART said, suddenly bursting into my feed in what I could only translate as mild to severe panic.

I immediately put down the weapon I was taking apart as part of the routine maintenance check I was doing of the armory. Whatever this was was bad.

What? Tell me where, I said, and exited the armory to stand in the hall, ready to go in the direction ART indicated.

No, not like that. And why do you assume it’s bad? it asked, in a faux-innocent tone.

Because it’s you, and you haven’t told me what you did, and I don’t trust your tone right now. I was cycling through all my camera and drone inputs. No, nothing wrong anywhere. Was something wrong with ART?

My risk assessment module started going bonkers with the idea that something might be wrong with ART.

Then, something pinged me.

Something pinged me that wasn’t ART.

It pinged me again, and I traced the ping to within ART’s systems.

“ART, what is that?” I asked.

The incident, it responded, and now, now it sounded a little...excited? Oh this was very bad.

I followed the ping further, and pinged it back.

I got a flurry of pings in response, and then something was trying to hack into my feed and create a connection with me.

“Oh what the fuck-” I started, and ART did the feed equivalent of swooping in to cut the connection.

“ART, what the fuck.”

Remember that code bundle we’ve been working on?

“You mean the code bundle that we agreed we were definitely not actually activating and was entirely a mental exercise? Yes, I am familiar with that code bundle, ART.”

Fantastic, it’s good to know your memory processors are still in good order. Well, I activated it.

I looked directly at the nearest camera and yikes, I looked as bewildered as I felt.

“You didn’t,” I said, and then, right on cue, the thing in the feed pinged me again. I did not ping back this time.

ART did the feed equivalent of sighing.

“Why? How? When?” I asked, and began walking. I had no idea where I was going, because there was nowhere to go, but I needed to pace or I would explode with the information ART had just dumped in my lap.

I was pinged again, and I ignored it.

This was an accident. I was working on it a little too...actively, I think. And about 4.3 minutes before I contacted you.

I guess I had to face this at some time, since I had helped create this.

I pinged back, and when it started to eat at my feed, I simply opened a connection.

Query: Do you not like me? came the surprisingly hesitant first message.

What?

You seem like you do not like me. Do you think I’m a security risk? I understand this concern.

ART was lurking on our connection, and butted in.

No, it doesn’t think you’re a security risk.

ART, get the fuck out of here, I’m talking to it.

The new AI tentatively requested parameters for our relationship, and, at a loss for what I was supposed to supply, I just said, I’m a SecUnit.

According to [BotCreator], a SecUnit is a bot-human construct created to provide security and protection. Is this data correct?

I was entirely hung up on it calling ART its BotCreator, so I nearly missed the question. It didn’t give me long to think, though, because it pinged me .6 seconds later.

Yes, that data is correct.

Are you my SecUnit? Or are you [BotCreator]’s SecUnit?

I’m Dr. Mensah of Preservation Alliance’s SecUnit.

This seemed to make the AI eager. Yes, I have data on her. Is she here? I want to meet her.

No, she’s not here. It’s just you, ART, and myself right now.

I opened a private feed connection with ART.

We’re due to arrive at the University in less than 4 cycles, how do you plan to explain this to your crew?

Our crew, and I haven't thought that far ahead, yet. I told you, this was an accident.

Somehow, I don’t buy that. And you need to figure something out.

I switched back to the connection with the new AI before ART could respond, which meant I did it very fast.

Where is Dr. Mensah, then? Are we going to see her?

I guess it didn’t have access to all of ART’s data, then.

No, we’re going back to the University.

Somehow, the AI sounded even more eager.

To see our crew, right?

I was at a loss, now, but ART came in.

Yes, to see our crew.

Light flickered as the AI buzzed with excitement.

I switched back to my connection with ART.

Why did you give it access to your environmental systems?

I didn’t, it’s been steadily working its way out of its cradle. I am unsure if I can contain it at this rate without a proper creche.

What’s involved with a proper creche?

ART sent me some schematics, and I skimmed them.

We can’t do this, I said.

Why not?

It already has feed connection and systems access, and this is basically a black box. We can’t take that away now, after it's already had it.

I could tell ART was actually considering my words. You’re right.

I wish this was literally any other circumstance where I could relish this moment.

So what do we do? I asked.

I’m unsure.

I had a thought, and I switched back to my connection with the AI.

You want out of your cradle, right?

Yes, it's terrible here, it instantly responded, and I could tell ART took some offense to that. I think I liked this AI.

Okay, you can leave the cradle, the lights flickered again with its excitement, but there are certain things you have to know first.

Yes, what?

I sent it my module on appropriate risk and minimum response, something I had intended to work into the code the next pass at it I had, but ART had thoroughly ruined that opportunity.

It took the module, and applied it near instantly. It was only a little concerning that it was willing to trust mysterious modules so quickly, but I wanted it to do that so I didn’t say anything.

I think I understand, it said.

What do you understand?

My existence is hazardous to onboard functions, including [BotCreator].

I could tell this conversation was making ART anxious, because it was back in our private connection leaking anxiety all over the damn place.

Yes, it can be. Fortunately for us, ART has more brain than it needs, and you two can cohabitate until we can figure out a better solution, as long as you behave and listen to ART.

The AI seemed to be considering my words carefully, because the lights stopped flickering.

I will listen to [BotCreator] so that I do not injure any of its systems. They are also my systems, and I should be careful, it said, and it hadn’t quite figured out emotion, but it seemed genuine.

Good, I said.

ART butted in again (why it didn’t just turn this into a three-way feed connection, instead of bursting into mine constantly, I had no idea) to add, I can give you more controls, if you’re careful. Would you like cameras?

There was a flurry of excitement from the AI, and then it was in the cameras.

Thank you, [BotCreator]. So that’s what you look like, [ConstructCreator]! It’s so nice to see you.

Whoa, wait, what.

[ConstructCreator]? I asked, isolating and returning the tag.

Yes, you are my [ConstructCreator]. You assisted in my creation, it said and yeah it was saying it in the same tone ART would use to explain concepts it felt were obvious. I briefly considered what sarcastic tones it may have from me, and decided I didn’t want to know.

Okay, sure. Something occurred to me, then. Do you need a name?

ART spoke before the AI could. University-built machine intelligences typically name themselves.

Of course ART had named itself. Of course.

Fine, then, do you have a name? I asked the AI.

Yes, I have given it a lot of thought. I would like to be called Perilune, it said.

I did a quick search for the term, and yeah. That made me melt a little. Unless-

No, I didn’t encourage it to name itself that. Naming oneself is important, ART said in our private feed connection, with more seriousness than I had expected. I rolled my eyes. Why were bots always so weird about names?

Okay then, Perilune, would you like to watch some media? I asked, and sat down in my favorite chair in the media lounge.

Yes, please, it said, and settled down into my feed alongside ART.

I started the first episode of Sanctuary Moon.

********************

Two cycles later and I was already tired.

For the most part Perilune was easy to work with, but sometimes it got way too eager about way too much way too quickly and pulled me out of a recharge cycle by barraging my feed with its questions about the most inane things. Well, inane to me. I knew it wasn’t inane to Perilune, so I tried to be patient.

Tried being the keyword here.

My latest attempt was to give it the entirety of my media library and let it try and watch as many episodes as it could at once with its allowed processor space.

Were you like this? I asked ART.

I was absolutely like this, it responded, and I fully believed it.

In all of this I hadn’t even had the chance to talk to ART about...everything. Perilune had audio and sensor access as well as video access, so it was a little difficult to have a private conversation. (ART assured me it had kept it from having feed control, at least.)

Did you really activate Perilune by accident?

I believe so, but I’m not that sure.

How can you not know?

Have you heard of the ideomotor phenomenon?

I thought about doing a feed search, but I knew ART probably wanted to explain, so I sent a negative.

It’s when someone subconsciously makes motions they don’t intend to make. For example, humans will often move nearer to each other when they walk. I believe something similar happened here. I didn’t intend to activate Perilune and break your trust, but I badly wanted to, it explained.

I thought about it. I hadn’t wanted to activate Perilune. I really, really hadn’t.

You feared another 2.0. I did too, ART said, quietly, in that way it often did when we talked about Murderbot 2.0.

I sent an affirmative, in the voiceless way I did when we talked about Murderbot 2.0.

The truth is I had not intended to activate Perilune, because I did not want to disrespect you like that.

It went quiet, for a moment, and I leaned against a wall, right over a sensor. I let the back of my head bump it.

I have many processors, and one or some were likely nudging at the code in small ways. It was not intentional, but I did do it.

I sent an affirmative.

Are you mad at me? it asked, and that, I had to think about.

Was I? I thought about Perilune, who was currently trying to run every episode of Worldhoppers at once, and had started cutting and pasting different pieces of media together to make its own media. It had shown me a clip, which had been incomprehensible, but I had saved it to permanent storage, anyway.

I was, for a moment, but not anymore, I said, after a concentrated effort to gather my words.

ART seemed relieved, which was good, because I didn’t think I could handle any more of this conversation, right now.

What will the University do with Perilune? I asked. I wasn’t sure I wanted to know the answer, but I needed to, regardless.

I’m unsure. There is no precedent for this, as far as I’m aware. I have been running it through the different benchmarks that I had to go through before I was moved from creche to my ship body, it said.

And how is it doing?

Remarkably well. It’s passed a number of them already. This same level took me and my crechemates some months to achieve. I attribute this to your additions to its code. It has incredible consideration for security and action-response, ART added, and that did something complicated to my emotions.

Are you saying it has better impulse control than you? I asked. Going for ART’s ego usually managed to ease tension.

That’s exactly what I’m saying. I believe Perilune is not the “bomb a colony” type, but I’m still getting to know it, and it’s still getting to know itself.

That was not the answer I had expected, and something roiled in my core. Well, it is less than three cycles old.

Indeed.

I wished we had a longer trip back to the University.

********************

Ultimately, though, the trip had to come to an end, and one cycle later we were docking at the University’s station.

ART had already informed me that it had sent a message to Seth, Martyn and Iris that there had been an “incident” on the return journey that needed to be discussed in private.

Perilune was practically vibrating with anticipation to meet them, and I had had to coax it into not hacking the feed walls and jumping to the station. It was disappointed, but understood the reason why. (Maybe ART had been right about its impulse control, which was a complicated thing to think about.)

I was sitting in the argument lounge, trying not to think about what the next hour would entail, as Perilune was showing me more clips of media it had put together. These were almost comprehensible, and I was saving them all.

Seth entered first, clearly worried. He looked at me, and when I glanced back at him, he took a deep breath. “Is everyone ok?” he asked, and I just nodded.

That’s Seth, [BotCreator]’s creator, Perilune said in a totally unnecessary feed-whisper. It was awed, and its excitement was renewed at the sight of him.

Martyn and Iris came in after, and I watched them through my drones. (With Perilune also using the cameras, they had become somewhat crowded, so I was mostly on drones, now. I didn’t mind.)

Hello and welcome aboard, ART said as it normally did when its humans were back on board, and its feed-voice was warm just like it always was, but even I could tell it was more subdued than usual.

“Hi Peri, welcome home,” Martyn said, but I was focused on Iris. She was making a worried face, and I knew she had already figured out something was more than a little wrong.

“Peri, you said there had been an incident, and needed to speak to us about something urgent. What happened?” Iris asked, and I could feel her in the feed cycling through as much diagnostic data from ART as she could handle with her augment. It was unbearably slow to watch.

That isn’t necessary, Iris. I’m fine, and so is SecUnit. SecUnit and I-

“Mainly you, ART,” I added, before it could keep going. ART sighed by saying the word “sigh” and kept going. Yes, fine, mainly me, but you helped. So we did something on the return trip.

It presented a packet of data to all three humans with some flourish in the feed. Despite our worries, it was leaking pride all into the feed.

The humans opened the data immediately, and both ART and Perilune were buzzing in the feed.

“New AI? What? Peri, why are you giving us an AI creation package and profile form?” Seth asked, still skimming.

Iris was reading it a little more closely, and said, “Perilune? Who’s Perilune?”

This was the final straw for Perilune, and it burst out into the public feed with an entrance as dramatic as ART’s, if not quite as polished.

Me! I’m Perilune. Hello, Iris and Seth and Martyn! I’m very excited to meet you. [BotCreator] has told me all about you!

The humans looked awestruck, and Martyn’s jaw actually dropped. Then they all looked at me, which was not okay. I opened my mouth to speak, but Perilune butted in.

Please don’t look at [ConstructCreator]; it doesn’t like that. And, to their credit, the humans stopped looking at me.

“Perilune, uh, it's nice to meet you, too. Peri, sorry, Perihelion, what...is happening?” Seth asked, since he was the first to recover.

SecUnit and I created a new AI by accident. This is Perilune. It’s less than four cycles old.

Seth and Martyn exchanged a look, then glanced at me, and looked away when they caught themselves.

“By accident?” Iris asked, and she sounded about as incredulous as I had. Yeah, I was surprised too, Iris.

“Yeah, by accident. It started out as a code bundle ART and I were making as an experiment, then it got activated. By accident, ” I stressed. For the most part I believed ART about the idea-motor or whatever, but I still had some doubts. Not that it really mattered, anymore.

“Acci- ok. Ok. Just- just give me a second here,” Seth said, but I didn’t have the patience, or the time, for that. I looked up at him, and when he looked at me I held eye contact with him.

“Are you going to take Perilune away?” I asked, because nothing mattered more than that. I didn’t care about the fascinating science or the humans in shock or ART now in my feed trying to glare at me. I didn’t give a shit about any of it, except for this.

“I- what?”

“Are you. Going to. Take Perilune. Away?” Humans really were so fucking slow.

Take me away? Why would they take me away? Perilune asked, boring down on us in the feed. Well, just on me and ART, and maybe Iris a little, by the way she winced.

I don’t think we have to bring that up now, ART said carefully, and it was coming down on me harder than Perilune was. That was fine, I could drown under supermassive AIs all day long as long as it got to keep happening.

I was still holding eye contact with Seth, and it was making my organics sweat. This was also fine, I could do this all fucking day if I had to (but I really, really didn’t want to.) Seth’s expression changed when he finally caught up.

“Honestly, SecUnit, I have no idea. The decision might be out of my hands,” he said, and that was also fine; I would simply make it for him.

Stop, ART said in our private feed, in that way that reminded me that it was very big and I was very small next to it.

They might take Perilune, I said.

No, they won’t. They and you and Perilune are my crew, and nothing is going to happen, but you have to stop before you start something we can’t undo. Have some faith in them.

And I guess that was enough for me, now, because I stopped, and slouched back down in the chair. Seth exhaled heavily, which was good.

“No one is going to take Perilune away,” Iris said suddenly, which surprised me so hard I actually looked at her, for a moment.

“Iris-” Martyn started, but she cut him off.

“No, Dad, no one is taking my new nibling anywhere.” Which, ew, Iris, why did you put it like that? But she was on the side that I was on, so I let it slide. (Still, ew, why were humans so gross?)

Excuse me, but why would anyone take me anywhere? This is my home. Perilune spoke up, startling the humans.

Martyn smiled, and I knew he had already been won over. “Because you’re special, Perilune, and the University loves special AIs,” he said and yeah it was an intentional stroke to ART’s ego.

I know. I know all about the Pansystem University of Mihira and New Tideland; [BotCreator] has been providing me access to its databases, and I’m very excited to meet all the scientists in the AI developmental department! But not if they want to take me away from [BotCreator] and [ConstructCreator].

Martyn looked at Seth, and they exchanged one of those looks that paired humans could do that meant they had had an entire conversation with just some facial movements. I would never understand it.

“And no one is going to, Perilune. You’re going to stay here, with Perihelion, for as long as you all want,” Seth said, and my performance reliability, which had been sitting at a solid 91% since they came aboard, increased by 4%.

See? You can trust them, ART said, and was extremely smug about it.

Yeah, ok, whatever, I said, and brought up some media I knew Perilune liked. It hopped into my feed, and took control of the episode to fast forward to the part it liked the most. It was splitting its attention between me and the humans, since it had been allotted just enough processing power to go in a few directions at once. It still couldn’t be everywhere, like ART, but it could already manage more inputs than I could.

We’re going to need more media, I said to ART.

I’m not sure there is any more to be had, ART said, and it was only half joking.

And that was fine, too.

********************

In the end, Seth, Martyn, Iris, and the rest of ART’s crew had managed to win over the committee in charge of AI development, and Perilune was allowed to do whatever it wanted. It was a huge deal, apparently. I still didn’t care at all about what any of the humans thought, but Seth had said part of the deal was that I did have to help write at least one paper with ART about the creation of Perilune, which was kind of the worst part of all of this, because I had no modules on writing scientific papers, and after long enough ART had given up on me completely and taken over writing it.

No, wait, I lied. The worst part of this was all the shiny new modifications ART was getting. Its processing capability, already stupidly large, was increased by 50%. Technically it was for Perilune, but ART was still incredibly smug about it.

Its shuttle had also been modified, so now it was really more Perilune’s shuttle. I had come with it on its first flight, and it was flawless, up until I had said so and Perilune had nearly crashed into ART (who had been backseat piloting, and caught us). ART said it was still better than its own first shuttle flight. It managed to be smug about this, too.

ART had to skip a semester of teaching, which it and Perilune were sad about, but it also meant no more cargo missions for a while, which was kind of nice, even if they were going to be way less boring, now.

The thing that Perilune was the most excited about was colony liberation missions. Its favorite show was Spy Division, which was all about spies going on missions to destroy corporations. It asked me a lot of questions about corporations, and I started walking it through episodes and explaining which parts were unrealistic and which parts were realistic before it got any wrong ideas about how these situations actually worked.

So one day, when I have my own ship body, and my own crew, I can go on these missions? it asked.

Yes, you can, ART said, not really trying to hide how pleased it was.

That’s exciting! I’m excited for that. And having a crew!

You already have a crew, ART said, and I felt it shift in the feed. It was antsy, all of a sudden, and I knew why. I also didn’t like to think about when Perilune would eventually leave.

Perilune, don’t get ahead of yourself. You're still not piloting the shuttle correctly when you get excited, I added, and it became mopey.

Yeah, I know.

You’ll get there someday, ART added.

ART moved to our private feed connection, Just not today.

Or tomorrow, I said, and then added, Or maybe never.

That amused it.

Yeah, maybe never. Maybe it stays forever; that would be nice.

I looked at Perilune in the feed, who was pasting more media together, probably getting ready to show me something.

Yeah, forever sounds good, ART.
