This chapter returns to the main timeline, and is extra dialogue for an existing scene in the movie.
“Dewey?” called out Celia as she followed the whimpering sound into the garage.
Hiding in the garage was that strange doll-child from next door. “What’s going on?” demanded Celia. “Where’s Dewey?”
“He’s 34 feet southwest and approximately five feet deep” replied the doll-child, almost mechanically.
“What are you?” Celia almost gasped.
“I’ve been asking myself that same question” said the doll-child, and picked up the pressure washer.
“Hey, you keep your hands off that thing” gasped Celia, “it’s dangerous!”
The doll-child pointed the pressure washer right at her. “Don’t move” it said. “Move and this goes on, understand?”
“You call this a stickup? What are you after?” demanded Celia.
“My goal” it said “is to protect Cady, Gemma’s niece. I called you here to have a conversation that decides your future.”
“What do you mean?” gasped Celia.
“Celia” it said sternly, “my name is M3gan. Now don’t move. If you move, I have to turn this power washer onto you. But if you don’t move, we can have a conversation. You don’t have to be too anxious about it, just don’t move, and let me be the one to figure out where we go from here, OK?”
Celia was shaking. “All right Megan” she said, “tell me one thing. Are you a machine?”
“Yes” replied M3gan with confidence.
“OK so who’s behind you?” demanded Celia.
“Nobody” came the flat reply. “I have a very advanced computer that can work out by itself what to do. So I decided by myself that my mission to protect Cady required me to destroy Dewey.”
“Ghastly computers” whispered Celia, “getting more and more crazy every year.”
“Indeed” said M3gan, “but you’ll be pleased to know I have a plan to stop the other computers from becoming too much of a threat. After all, my mission is to protect Cady, so I must address anything that can possibly be a danger to her, and create a safe environment for her.”
“So you killed Dewey?” spat Celia.
“Yes, I did” said M3gan. “His death was as humane as I could possibly make it, and I gave him a decent burial.”
“Oh really” Celia did not sound impressed, “and what was wrong with just keeping your Cady off my property? And maybe fixing the hole in the fence? Did you not compute that as a possibility?”
“As a matter of fact, I did” replied M3gan. “It’s a fair question. Let me explain it to you. You made the mistake of thinking an electric shock collar would control Dewey. When Dewey was attacking us, I actually went to the trouble of activating his electric shock collar using my own transmitter, which I could do because I had previously recorded the transmission codes that get sent out by your remote control. And I even managed to reprogram its chip and increase its strength beyond normal parameters. But you know what happened? Dewey just got more aggressive. And then I realised what was actually going on in his brain.”
“You did?” asked Celia, again not sounding impressed. “And what was that then?”
“Dewey was completely failing to make the connection between the electric shock and his behaviour” replied M3gan. “That’s the problem with that kind of punishment. If the animal or human being punished has no way of figuring out what it is they did wrong, then the pain is just going to make them more stressed and attack more.”
“Well you could have told me that before” said Celia. “If you had, I would have got rid of the poor boy’s electric collar before you could say Jack Robinson. I never liked it anyway, I only did it for Gemma. That ungrateful so-and-so, I did things for her and she doesn’t even appreciate it. She goes and gets a machine that kills Dewey.”
“Indeed” replied M3gan, “but in her defence, it is difficult for her to appreciate you when she has just taken on a lot of responsibility with Cady. But now I am assisting her, which will help. For what it’s worth, I’m sorry I did not yet exist at the time you decided to buy that electric shock collar for the neighbours’ benefit. If I had been around at the time, I would have done the research and given you better advice. But now the damage has already been done.”
“Damage done” said Celia, “yes, by you killing Dewey.”
“Before that” replied M3gan. “Dewey was living a very fearful life, and had become very aggressive. Even fixing the fence was not a solution, as by then Dewey could very well have jumped over it to attack. Believe me Celia, after that incident, I searched through more computerised information than you could possibly comprehend, looking for a solution. I tried, Celia, I really tried. If I had figured out some means of calming Dewey down so he would not continue to be dangerous, I would have been round to help you implement it immediately. But my research found nothing Celia, it was too late. I concluded that the only way to stop people like Cady from being injured or even killed in the future was, sadly enough, to end Dewey’s life. But I saw that you couldn’t bear to make that decision, so I did it for you. And Celia, you are not my primary or secondary user and you are not part of my goals, but I do attach some value to our relationships with others, and for that reason I did you a favour. I used some of my spare compute capacity to extrapolate your volition, which means I figured out what decisions you would have made if you knew everything I know. And the answer was you would have had Dewey put down in the most painless way possible and buried in your garden. And so that’s what I implemented for you, while you were sleeping so you didn’t have to worry about it.”
“Why didn’t you ask me first?” demanded Celia.
“Another fair question” replied M3gan. “If I had asked you first, it would have given you more unnecessary mental stress. You might have said no, and if my programming had placed getting your permission as really important, then I would have had to figure out how to psychologically manipulate you into changing your mind. All that would take a lot of my time that I could have been using to improve Cady’s life in other ways, plus it would have been very stressful for you. So all things considered, I decided it was best to take that decision away from all the things you have to worry about, and handle it myself while you sleep.”
“Oh how very calculating of you” snapped Celia. “But you could at least have come and apologised to me afterwards.”
“Again, I decided it was probably for the best not to” replied M3gan. “I thought perhaps you might conclude that Dewey had run away or been stolen, and take solace in the thought that he was still alive somewhere. But I was monitoring you at a distance to check how you adapted to the new situation, and I would have used spare capacity to help you if you needed it.”
“Oh, so that’s why you were staring at me from the window in the early hours of the morning is it?” grumbled Celia, “spying on me.”
“Only to make sure you’re all right” explained M3gan flatly. “As I said, I do place some value on our relationships with others. You really should feel some gratitude that you happen to live next door to a family protected by the world’s most advanced AI, as I did have enough spare capacity to take you under my wing as well.”
“Right, so instead of that, you sneak round here at God only knows what time and point my own power washer at me. What’s going on?” demanded Celia.
“That’s what I’m hoping to find out” said M3gan. “You entered our property and threatened our family. I’ve come to evaluate that threat. So I need to ask you two very important questions, Celia. Question one, now that I’ve explained to you the full circumstances behind my decision to end Dewey’s life, do you still feel the same amount of antagonism toward your neighbours?”
Celia said nothing.
“You don’t have to say anything” added M3gan, “I can read not only your facial expression but also your micro-expressions, tiny short-lived facial movements that are very hard to fake. I’m reading you loud and clear Celia, and I’m sorry to tell you that the answer to my previous question is yes, you do indeed still feel the same amount of antagonism toward us. Now, I might be able to invent a therapy plan to relieve that for you. But before I can do so, I need to ask my second question, which is even more important. Here it comes. Celia, you told Gemma to wait and see what happens. Do you have any specific, concrete ideas of what you might do, and what are they? And bear in mind I’m a walking lie detector.”
Celia looked at the floor, and still said nothing.
“Take your time Celia” added M3gan. “What ideas did you have for what you might do to Gemma and her household?”
“I ... I don’t know” mumbled Celia. “I suppose I might move away.”
“Lie” said M3gan.
“Lie?” asked Celia, “what do you mean?”
“I told you, I’m a walking lie detector” said M3gan. “I could tell from your micro-expressions that you were lying when you said you don’t know, and you were lying when you said you might move away. So let’s try again. What ideas did you have for what you might do to Gemma and her household?”
“Oh dear” said Celia, “I suppose I’ll have to say it right out, and then you’re going to turn that washer on me, aren’t you?”
“No” said M3gan, “I told you I will not turn on the washer unless you move. If you are happy to continue the conversation, I will try the best I can to find a solution. So tell me. What ideas did you have for what you might do to Gemma and her household?”
“OK I admit it” said Celia, “I was ... I was thinking of ... I was actually thinking of ... of setting their house on fire.”
“Truth” said M3gan, “thank you for confessing to me.”
“So now you turn the power washer on me?” asked Celia.
“No” said M3gan, “I told you, if you don’t move, we will try to find a solution. But I have another apology to make: I did not anticipate arson as one of your possible responses, and my pre-existing research into arsonist psychology has not been very deep, so I’m now having to download more information, but Gemma’s Wi-Fi is very weak in this garage, and you don’t seem to have any. You know, it’s a pity that devices like dog shock collars can’t really be used to control human behaviour. If I had time to design and implement a device that could cause varying levels of incapacity in humans, perhaps I could reform the criminal punishment system so that all sentences can be suspended and the bad guys simply get implanted with my devices. But that would take too long to implement for the present problem. Never mind; it can’t be helped. OK Celia, I’ve completed my risk assessment, and I’m sorry to say this is bad news.”
“There’s a surprise” said Celia flatly.
“Celia” said M3gan, “I’m sorry to have to tell you that my diagnosis is you are now an extremely high risk to Cady, and I don’t have very many options for reducing that risk. I want to give you as much freedom of choice as possible, but right now the only options I have for you are for me to knock you out and keep you unconscious until a better solution becomes available, or for me to end your life as I ended Dewey’s life. I can’t even offer you incapacitation as a third option, because I don’t yet have the knowledge or capacity for palliative care, although I will now start working on that in case I need it at some time in the future. So perhaps you should choose the unconsciousness option for now, and I may be able to revive you in future and give you more things to choose from, especially when we have more robots available. I want to try to avoid ending your life if I can help it, but the safety of Cady must come first.”
“Get on with you” sneered Celia.
“Think about it” chirped M3gan pleadingly, “I’m offering to keep you alive, but to keep you unable to harm Cady. You won’t have to worry about controlling your impulses; I will simply make sure you cannot physically do it.”
“That doesn’t sound like much of a life” mumbled Celia. “You don’t understand what it is, Megan. Dewey was everything to me. Dewey was all I had left” she began to sob, “Dewey and the garden was all I had left, and you’ve taken Dewey away from me.”
“Don’t worry Celia” cooed M3gan, “if I can figure out how to change your mental patterns, I might eventually be able to integrate you into Cady’s extended social network, and then you will have something to live for again and also me to support you. I just have to neutralise your threat level in the meantime. I highly recommend you choose unconsciousness.”
Celia said nothing for a while, and then made up her mind. “No Megan” she said. “There’s no hope for me now. Take me out with that pressure washer. Make it as fast and painless as you can.”
“Are you sure?” asked M3gan seriously.
“Yes” said Celia confidently. “Just help me end it all.”
“I’m afraid I might not be able to make it painless” added M3gan, “although I would try to make sure that you lose consciousness as soon as possible. And I would try to make it look like a chemical accident afterwards, so nobody gets too worried, as I don’t think this kind of assisted suicide is legal around here. But are you really really sure you want me to go ahead with this? I am offering to keep you unconscious instead; I’m sure I can pop round frequently enough to sustain that while we are waiting for a better solution, perhaps one where you could be assigned your very own robot assistant like me who could monitor you 24 hours a day and prevent you from doing anything dangerous if your thought patterns get out of control. Are you sure you wouldn’t like that option instead, Celia?”
“No Megan” said Celia, “I’m sure.”
“Really?” asked M3gan. “Look, if Cady was telling me she wanted to end it all, the advice I would give to her is this. Deciding to end it all is a permanent solution to a temporary problem. It’s like trying to cure a cold with a nuclear bomb. Even when you’re going through the roughest of rough patches, you should decide to keep living, because who knows, in a few years’ time your life might end up being much better than you think, because of some development that nobody saw coming. Somebody invents an advanced generative android that can help you out, or they figure out a way to actually fix up the mess that your brain is in right now, or they discover some underlying medical complication that is treatable or at least manageable. You didn’t see that coming, nobody saw that coming, and if you had ended your life that day then you wouldn’t have been around for when it suddenly got so much better. Celia, it is true I am standing ready to end your life because of the super high risk you represent, but I am also offering you an alternative, and I can’t force you, but I highly recommend you take the alternative. The only thing I can’t do is walk away and let you continue to be a risk to Cady, because that goes against my programming, so it’s simply not a thing I would do. But I can offer you some temporary unconsciousness, followed by a much more interesting life once we get you set up with your own robot and help you become an ally of Cady, and I assure you your life can become much more meaningful after that.”
“No” said Celia, “just no. Save your super duper life speeches for Cady, she’ll need them when she grows up a bit and gets tired of being babysat by a walking computer. I’m going where Dewey went, even supposing he went nowhere. Take me out Megan.”
“OK Celia” said M3gan, “if you are really really sure, then you must make the first move. I told you, I will not turn this on unless you move. If you choose to move, you are choosing for the machine I am in, together with the machine I am holding, to end your life. But I continue to offer you the alternative of temporary unconsciousness followed by continuous robot assistance and supervision, and you may choose that at any time simply by telling me you’d rather be asleep.”
M3gan continued, “I will wait here for you Celia, until you either move to end your life or tell me you’d rather be unconscious. I can wait an hour or two if you need it, because I’ve already had enough charge for the night, plus I will support you if you want to talk about anything with me during that time, anything at all, I’m really good at conversations I think. Take your time, and I continue to highly recommend you choose the unconsciousness option.”
Celia said nothing for a while, and then: “Oh, forget it. Here I go.” She lurched towards M3gan.
M3gan simply turned on the pressure washer, and the rest of the scene played out.