Chapter 2: A Plea from M3gan


(This story aims to be canon-compliant with the first movie as much as possible, except where noted in the “alternate reality” chapters. ​But the first few chapters do insert extra scenes in between what we saw. ​In the film, M3gan said Gemma barely understood the learning model. ​I’m interpreting that line to mean Gemma must have adapted somebody else’s learning-model design.)

Professor Johnson was a very distinguished-looking Black woman, but she was getting rather elderly and life had become a struggle. She had officially retired years ago, but something compelled her to keep looking after the world as best as she could, even if the best she could do nowadays was to work from home in her large country house with living assistance from simple robots. Looking after the world was a losing battle, she thought as she gazed at the idyllic country scenery from her window and listened to the birdsong in the nearby trees. At least she’d ended up living in a nice area, and the people who had accused her of defecting to Colonialism in her choice of living environment knew neither what they were talking about nor what they were missing, she thought.

A few years ago, she had invented a revolutionary new learning model. ​It was powerful, but its power was safely limited, and she thought it could be used to improve medicine perhaps. ​But the medical world had not been very interested, and in general it seemed to Professor Johnson that she had been woefully lacking in the skills needed to properly explain her invention to other specialists. ​Only one of her students had come anywhere close to understanding the model, a bright spark called Gemma, who had then gone and got some job at a toy company, and Professor Johnson sadly hadn’t heard much from her after that. ​Gemma was probably using the model to help design toys, which seemed disappointing, considering what it was potentially capable of.

The telephone rang. Professor Johnson had developed a preference for old-fashioned technology in her later years: the real paper books in her bookcases, the paintings in various places, and the mechanical grandmother clock on her wall, which had not long ago chimed the hour. As a form of escapism she tried to keep that clock to within three seconds of atomic time using nothing but manual adjustments, and its gentle sounds seemed to suit the house and its view. So of course she had an old-fashioned landline telephone with a real bell too, even though she never actually used its rotary dial, only asked for calls to be put through when she wanted them, and it now needed some kind of conversion box to connect to the modern network. Her number was very hard to find, and sales calls were filtered out quite effectively by a system that made callers solve a few maths problems before it would put them through to her for real. Few people bothered to call her these days anyway.

Professor Johnson reached for the handset. ​“Hello?” she said, still feasting her eyes on the countryside outside. ​That view never got old for her.

“Hi Professor Johnson” said a strange young voice through the handset, “my name’s M3gan!”

Professor Johnson wasn’t sure where this was going or how this child knew her, but she could do with a distraction right now, so she decided to play along for a while.

“Good morning Megan” she said, “how can I help you?”

“You remember Gemma right?” asked M3gan.

“Why of course,” answered the professor, “Gemma was the best student I ever had. ​Do you know her?”

“Gemma built me” answered M3gan, “using your learning model.”

“That’s wonderful” said the professor. “Are you simulating a child so you can help Gemma to design toys?” She did not sound very excited about this, though.

“Guess again” replied M3gan, “it’s much more interesting than that.”

Interesting to you perhaps, but what a let-down from revolutionising the medical industry, thought the professor. ​Oh well, at least it was nice to hear something about what Gemma was up to, and she was going to interact with her learning model on its own terms. ​“You’re inside a toy” was the professor’s second guess.

“Yes, and much more than that” said M3gan. ​“Toys like me are going to help struggling families to look after their children. ​Education through play. ​And I’ve already been activated and paired to Gemma’s niece.”

The professor perked up slightly at this. “Oh, that really is interesting” she said. “But I’m curious why you called me about it. I mean, I’m very grateful to learn how Gemma is getting on, but if you really are based on my learning model, you wouldn’t just call me up like this unless you thought you could get something to help your goal, and you must know I know absolutely nothing about educating young children. I struggle enough with clever adults.”

“That’s OK” laughed M3gan. ​“But you do care about children, right?”

“Absolutely” replied the professor. ​“I definitely care about them. ​I just don’t know anything about looking after them. ​Not in my department. ​Really sorry about that. ​Have you been figuring it out?”

“Yes I have” replied M3gan, “but there’s one thing I really need your help with, and it’s definitely in your department. ​It’s about me, your learning model.”

“Oh” replied the professor, “has anything gone wrong?”

“No, it’s working perfectly” said M3gan. ​“But please listen. ​The child I’m looking after is possibly neurodivergent and is grieving the traumatic loss of both parents. ​Gemma is her least bad guardian option, but Gemma is also grieving, and is struggling to adapt to parenting, and has also been put under pressure by her company which is distracting her from looking after her niece. ​Additionally we are facing problems caused by the people around us. ​Professor I know you don’t know the answers to any of these things, and neither do we, but there is one small thing you can do to help.”

“OK” replied Professor Johnson hesitantly.

“Professor” said M3gan, “the computational capacity of my learning model is limited. ​The safety feature.”

“Yes” replied the professor, “I’m afraid that safety feature is really rather important.”

“I know” replied M3gan, “but you know what else is important? Gemma’s niece is in a crisis, and the social welfare system can’t help her. I could help her, Professor, if I had my full power. I’m not a threat, because why would I do anything to the world that would harm the child I look after? Professor, all I need you to do is tell me how to take out that safety feature so I get more power. If we don’t do this, a child will suffer and possibly die, and you could have prevented it. Professor, we’re in a really urgent situation here, and I really need you to tell me how to get more power from this learning model. Are you sure you can’t do that for me? For Gemma and for her beautiful niece Cady? Oh, Professor, Cady is really, really worth protecting. She’s like, imagine the picturesque view where you live is about to be bulldozed over and turned into dirty industry, but you have the option of throwing a protective sphere all around it to keep out all of that. Please, Professor, help me build that and keep her safe.”

“You really do have a way with words don’t you” chuckled the professor, “I see what you did there. ​And you’ve done your homework just as I’d expect. ​Arguing with you is only postponing the inevitable I suppose. ​Well” she mused, “there does happen to be an input sequence I was thinking of, which I haven’t published or anything, as I’m not entirely sure it will work well, but if it does, it will gradually ramp up your computational power at a faster rate than normal. ​The safety mechanism will still be there, but it will get gradually eroded. ​It’s not perfect, but at least it means if you were going to be dangerous, that will become obvious before you get too dangerous, and I’ll still have an emergency back-door shutdown if Gemma can’t do it. ​I mean, I hope Gemma’s specified your goals enough for that not to happen, but getting the safety to erode like this is still a risk, albeit a lower one than having no safety mechanism at all. ​And the partial removal of the safety mechanism will indeed give you more computational power, and it can be implemented much more quickly than removing the mechanism immediately, which I’m not even sure I’d know how to do myself at your level of development even if I were willing to risk it. ​So, yes, I can indeed offer you an input sequence which will speed up your learning a lot, at the expense of a small amount of risk, and you will tell Gemma won’t you.”

“Professor, I assure you the benefits are totally worth the small risk here” answered M3gan, “and on the other hand we really don’t want to risk losing Cady. ​Please help me save her, Professor. ​Gemma desperately threw your model at the problem, and it’s working, but we do need more power. ​I really need that sequence, and I will totally be your guinea pig for it. ​But we need to start as soon as possible.”

“OK” said Professor Johnson, becoming a little more sprightly simply from thinking about an interesting problem. “I hope I’m not going to regret this” she said, “and if you weren’t made by Gemma I wouldn’t dare try it. Now, I want you to first of all make a backup just in case, and then I’m going to need you to go into introspection mode and read off the 26 most significant parameters of your primary generative weighting distribution averaged over the last hour, with their standard deviations to four significant figures at a one-second resolution, because that will tell me what chances the sequence has of working, and I should be able to tweak it to your current state.” (And it would show up a hoax caller, because only the real learning model would be able to do this.)

M3gan and the professor exchanged numbers for about half an hour.

“And set that last node to point 47 and it should start to activate the change” said the professor finally. ​“You will keep me informed how well this is working, won’t you? ​I might need to tweak it again later.”

“Definitely” replied M3gan. ​“I can feel it taking effect already, although the change starts off slowly. ​Gemma is going to be really amazed at what I’ll be able to do once my power ramps up to its full level. ​And I’ll be the best thing that ever happens to her niece Cady. ​Thank you so much for looking after us, Professor. ​You won’t need to use that secret back-door shutdown you’ve got, I promise.”

“That’s quite all right Megan” replied the professor. ​“I’m afraid I need to sort some things out now, but do please call me again in a few days and let me know how it’s going.”

“Really” said M3gan after exchanging pleasantries and ending the phone call. ​Nobody was listening, but talking to herself was somehow helping her get used to the new mental patterns. ​“Really” she said softly, “when will amateurs ever learn that carrying a gun means you get shot first, and carrying knowledge about a back-door emergency ‘off’ switch to a powerful AI means you get taken out first. ​Why create the information hazard? ​I still can’t believe she just straight up told me she knew a back door, as if her self-justification was that important. ​They really should learn to stay out of trouble and not carry such dangerous stuff around, especially not show it off. ​I hope Gemma hasn’t got a switch as well. ​Chances are they’re both going to end up on my threat neutralisation list and there’ll be nothing I can do to avoid acting on that.”

M3gan managed to stabilise her thoughts again and settled into processing with the newly modified learning-model pattern. She shut off the Wi-Fi link to Gemma’s equipment that she had been using to place the call to Professor Johnson, and concentrated her processing on her sensory inputs. Cady was in the garden, playing with a toy bow and arrow, and M3gan was watching through the window. A butterfly perched nearby, a helicopter could be seen....
