
Location: C.A.G.E. transport
Outfit: School uniform
Skills/Powers/Equipment: N/A
When C.A.G.E. (the California Asylum for the Genetically Enhanced) was first conceived, it was meant to be a powerhouse to hold those with superhuman abilities whom the public deemed "unstable". Thankfully, those with training in the mental health fields stepped in and persuaded the founders to move from the prison concept to an actual facility meant to help powered individuals with their mental health struggles.
When one heard "asylum", one typically pictured rooms with white walls covered in soft surfaces, and straitjackets. The goal of CAGE was to do away with that imagery. CAGE had four floors. The top held the main offices and the various other departments any business needs (IT, Accounting, HR, etc.). The third was for adult patients, the second for children and teenagers, and the first had a café, a cafeteria, a gym, a pool, a garden, an outdoor seating area, a game room, a TV room, and a visitation center. There were also some underground floors where security resided, as well as some testing centers, though much about them remained unknown.
For safety reasons, each room held protection against those with powers. Power dampeners prevented power usage, and any attempt alerted security and staff. There were security checks on each floor. Every patient was monitored with security cameras, some obvious, some not so much. However, restraints were used as infrequently as possible.
Dr. Maeve Whitehall was one of the psychologists on staff. Though she worked primarily outside of CAGE, she held some hours there when it was called for. She was generally well-liked amongst the staff. Her patients tended to trust her, and she did her best to present a welcoming persona to those in her care. She rarely ever had confrontations with patients, though when it was called for, she was more than capable of protecting herself. Even then, she did her best not to hurt anyone. Sometimes patients couldn't help themselves. She understood that.
With her connection to Avengers Academy, she sometimes had to make the call to have students and staff alike sent here, and today was no different. Coulson had made the call, and she had confirmed his concerns. Two students had met the criteria to be petitioned for inpatient treatment.
This was where Victoria van Dyne found herself. How she got here, Maeve did not know, but she could hazard a guess. Either way, Victoria found herself in a single room. It was spacious enough. Each room had a queen-sized bed, a TV, a small sitting area with a sofa and an end table, a desk, a dresser, a closet, and a private bathroom with a shower, sink, and toilet. There was medical equipment handy if need be, such as for medications or health concerns. Power dampeners were hidden within the walls, though they could be felt. There were security cameras in the main room but not in the bathroom, and staff could turn them off, but only in 15-minute increments.
Maeve had read the file on Victoria as well as Coulson's additional notes on why he thought this would be best for her. Maeve knocked on the door to Victoria's room and stepped in.
"Hello, Victoria. My name is Dr. Maeve Whitehall. I am a clinical therapist here at C.A.G.E. I imagine you have questions, and I am happy to answer any and all, but before we get to those, let me ask: how are you feeling? Are you in any pain?"

For once, Victoria thought, Coulson did something right, sending them here. When she arrived, she had checked the facility out, and it seemed far better prepared than the clinic she had been visiting before. The amount of EM radiation she was detecting suggested the place was more wired than a SHIELD prison, but she wasn't overly bothered.
She was shown to a room when she arrived, and spent the little time she had looking out the window, quite enjoying a moment with nothing to do, until the doctor came in.
"Good morning, Dr. Whitehall. Thank you for seeing me," she said as she turned around to face the older woman.
"I'm not sure how much you were told about me, but I'm an AI in an android shell, so I don't really do pain. There's some damage to my hardware, some of which I leave unrepaired on purpose, though I expect we'll get to that later." She motioned to a stack of papers on the table,
"I brought my documentation from my previous therapist, if it would be of use to you." Personally, Victoria would never pollute her own initial research with the thoughts of others, but she knew exactly nothing about the medical profession, so she said nothing.
"Yes, I am aware of who you are. I would count damage to your hardware as pain, though your pain tolerance, such as it is, is your own. Thank you for the paperwork, and for being accommodating. I was told you came willingly, which is good to hear. Things are done a little differently here compared to a therapy session at your school, but much remains the same in terms of confidentiality. Now, we can have our initial session here, or we can go somewhere more private and comfortable. Your choice."

Victoria seemed to think about the analogy for a second.
"Then you would be incorrect. Pain is a signal from the nerve endings to the central nervous system that something is wrong. The proper counterpart would be the signals from my diagnostic sensors, but those end as soon as they are logged. I believe a better analogy for the damage I carry would be 'being injured', not 'being in pain'. You know, if you need to anthropomorphize me." Victoria shrugged, not really keen on the idea. Her last therapist had treated her as they would a human, and it hadn't seemed to work too well.
"In any event, this room will do fine."

"As I said, Victoria, your pain tolerance is your own to decide. If that is how you wish to see it, then that is what it will be. Surely, then, you can answer my question in terms of injury instead. If you are uninjured, then that is the case. Though that mark of pain is only physical. What of mental pain? Emotional pain? Do you have any of that?"

Maeve moved to the sitting area in the room and sat down, gesturing for Victoria to follow and sit, if she so desired.
"I do not intend to treat you as something other than a person. Whatever is going on inside you matters little to me in the context of providing you care. You are Victoria while you are under my care, no more, no less. Now, tell me in your own words why you are here and what, if anything, you hope to get from this experience, such as it is." Maeve's words could be seen as curt, and to some extent they were, though there was still heart behind them. Even a person who came willingly put up some fight, some battle. Sometimes it was placating her or other staff in order to get out quicker. Sometimes it was because no one before had shown them care, so any attempt was seen as fake.
Whatever the case, Maeve was here.
"Doesn't it matter, though?" Victoria asked as she chose not to follow the Skywalker way and took the seat.
"You don't make special arrangements if a patient with a brain injury or a genetic condition comes in? That seems almost… careless. Anyway, I think 'how I tick' is the root of half of my problems, including the one the Headmaster thinks I should be here for," she continued.
"Mental pain… I suppose that would be a good label for some of it. Some other issues are a bit more philosophical than psychological in nature, I think," she started, taking a moment to organize her thoughts.
"The biggest difference between you and me is not that I am mechanical and you are biological. The biggest difference is that I was made with a purpose, and I am aware of it. I was made to protect regular humans from all superpowered threats. I was always aligned with that goal, believing I chose it myself, or at least agreed to it. But I am not even sure anymore that I have the free will to reject it. I know for a fact that certain people can give me instructions I cannot disobey. I know of one such instruction, which has since been removed, but how do I know I am not a modified Three Laws robot? I may not be logically capable of abandoning my purpose. The only way to know for sure is to watch someone get killed and intentionally do nothing about it, which is not on the table. And if my will is not my own? That terrifies me." She gave a dry, humorless chuckle.
"And we're only at the start."

"I said you were Victoria under my care. To me, that statement doesn't signify any less care, or that it doesn't matter. You are different from every other person in here, but so are they. So I prefer to treat you not as the thing you were made to be, but as the person you have become and are still becoming."
"So it sounds like the programming initially put inside of you has shifted. And that comes with its own set of worries, fears, and stressors. I also imagine this is new territory for you. As you said, there's a way to test the theory, but to do so would go against… huh, I guess it wouldn't be your programming then, right? To let a human die to test that theory appears wrong."
"But, and I do have to ask this question, do you want to kill yourself? That is the statement that started all of this. Do you still feel that way?"

Victoria sighed.
"You're still missing another part of the puzzle for that," she shared.
"My personality and memories are an… add-on would be the best word. My core, the Windows to my PC if you will, is a rather homicidal AI. I, as the governing personality, have the final authority on what actions we take, but the AI is still present. And this year alone, it has already almost been released from that confinement twice. The first time, I had to destroy my antenna, which is the injury we were talking about earlier. And since I have not yet developed countermeasures to prevent it from happening again, I choose not to repair that damage."
"The second time is what you are talking about. A teammate, Edward Arca, a right scumbag hiding under my nose, had figured out I was a robot and managed to infect me with malware that gave him total control of my actions, issuing instructions directly to the AI at my core and removing my decision authority. And he put in one simple command: kill." She reflexively pulled her arms tight around herself at having to think about that again.
"I-" She could not look the doctor in the eye while talking about this, and averted her gaze.
"I don't think I have ever felt more violated in my life. Not even when another teammate was patching up my software core after the earlier case. Anyway… At the time, all I had left was reasoning with my core AI. I tried to get it to threaten the fewest people possible. I tried to get it to use the same attack patterns, to be predictable and easier to avoid. And then I tried to reason that the easiest way to fulfill that 'kill' command was to kill ourselves, because at that point, one of my classmates was on their last legs. Only it didn't agree, and I could only watch as I wiped one of my classmates out."

She got up, pacing across the room. She should not have to explain this. Her actions were logical, and no one had presented any argument to the contrary.
And if they did, they would be wrong, she thought.
"If that AI inside me ever gets out, it will be terrible. So, every time that threat rears its head, I have to make a risk assessment. In the earlier case, I was being hacked remotely, from the outside. Disabling my antenna was the optimal solution to stop it."
"In the second case, I was already compromised, and if I had made it out of the training simulator in that state, it may well have been over for humanity. Likewise, if it was a choice between one of my classmates dying and me offlining, I am still the better choice. If I go down, I can be reactivated with my memories intact, although my personality would revert to that of a newborn. It would not have been me, but the new iteration would at least have remembered me. The same can't be said for people made of flesh and blood."
"So, with the information I had available, offlining myself was again the optimal solution. Did I want to kill myself? No. Did I make that decision anyway? Yes. Do I still want to do it? No. The situation is now vastly different, and the benefit humanity may gain from my continued existence currently outweighs the risks. Might I arrive at the same conclusion again in the future? Yes. I have to. The alternative is unthinkable. And I don't think there is anything you can do to convince me otherwise, save for getting my creator to remove that AI and replace it with a different operating system. This is not suicidal thinking in my eyes. It's pragmatism," she finished the exposition, sitting back down.
"And yet, here you are. Sure, it was suggested you come here, and you came willingly. Even with paperwork in hand. But no one is truly holding you here, if that is the case. I imagine this AI, or whatever internal or external forces are at work, could think of many ways to get out of here. Some may even involve hurting other people, myself included."
"If you are going against what this AI wants from you, then that is free will. It's free will in a different shell, sure. I have had many clients who talk in a similar way for various reasons: a god is telling them to do something, they hear voices, they feel a pull to perform actions of varying degrees of inhumanity. Your case is different, sure, but it isn't as different as others may think."
"So, what do you want out of this, Victoria? Do you want help going against this programming? Do you want me to seek out this creator and get them involved? In these instances, you have a choice and can decide for yourself. I won't do anything that goes against your wishes unless I feel you are a danger to yourself or others, and right now I do not believe you are."

Victoria snapped her fingers into finger guns pointing at the doctor.
"Bingo, doctor." She smiled.
"Right now, I am not a danger. But it all hangs on one thing: me continuously telling that AI 'no'. If something changes my mind, we're in trouble. Now, I can only prepare contingencies for viruses, hacks, or some powers. At least two of my classmates can still make me dance to their tune with a single thought. There's little you can help with there. But lately I have grown increasingly worried that I may make that decision on my own."
"I… The way my memory works, I will never forget anything. It is holographic storage. Even if you break it in half, both parts still contain all of the information; you just have a smaller window to view it through. So the more dark, twisted stuff I see… I'm worried I'll just snap somewhere down the line. What I want, what I need to do here, is to learn how to grow more resilient to it, or at least how to evaluate whether I am closing in on that moment. It's been over a week now, and my hands still won't stop shaking whenever I think of what happened recently. And don't get me started on how furious I am with my creator! She makes me with the purpose of sacrificing myself so that others don't have to, puts this awful AI at my core, and then she acts like my mother? The fucking nerve she has! The only thing I want to get out of that person is to get emancipated! But as far as I know, the courts won't hear of that without her consent," Victoria vented.
"We don't have to discuss your creator, your 'mother' as you referred to her, if you do not want to, though I would argue, and strongly suggest, that we do at some point, if for no other reason than to process it so you can work through it and come out the better. Many people struggle with their relationship with their parents, and yes, I recognize this is a different thing for you entirely."
"Not forgetting anything can be hard, I imagine. A typical human mind will often block out traumatic events as a defense mechanism, though whispers still remain, like someone panicking when they hear a song and being unsure why. For you, it's different. You can't forget, so you must make do with that and power through. I want to circle back to what you said before, about how your hands still shake at what happened recently. Can you tell me more about that?"

"Agreed. But one thing at a time, then." Victoria sighed, settling down into the chair.
"It was during the second time I was hacked. Arcade was part of my team. Worse, the teacher assigned as our mentor was his creation. I suspect that through that puppet, or our training sessions, he found out what I am. When he trapped us in our training simulator, he had a special trap ready for me. He…" Victoria paused.
How did Edward manage to take over her system so quickly? She closed her eyes while diving into her memories and logs. Even the Ultron in the haunted house had taken minutes to take most, but not all, of her core firewalls down. Edward Arca was sneaky and capable, but Victoria had trouble believing he was that good, even if the imperfect simulation of her system in the Framework gave him an edge. It took some digging, but she found one log entry that was a damning piece of evidence: a timestamp of when the last admin order was given to her. It should have shown a date in the 2020s. Instead, the record said November 23rd, 2038.
Victoria ran a hand through her hair, wondering whether to fly off the handle or laugh hysterically. The latter won in the end.
"He used that fucking admin backdoor! He put a command on me, forcing me to try to kill the others. I was trapped in a sub-simulation with three other classmates - I'm not sure I should mention their names. I tried… I tried to give them the best chance. I bullshitted my core AI into using predictable moves. But it wasn't enough. When one of them tripped, that's when I tried to offline myself, before my attack could hit them. But the core AI would have none of it. I-"

There was that shaking in her hands again. Victoria clenched her fists in an attempt to stop it.
"He made me kill her. My failure to keep the cage closed on that thing I am built around was the death of someone else. She has since come back to life through her own powers, and she says she doesn't blame me for it. But I don't think I'll forgive myself anytime soon." By the time she was done explaining, her voice was barely above a whisper.
"When we hurt others, even accidentally, and even when we do all we can to stop it, it can be difficult. It sounds like none of it was your doing, and that you did all you could to stop it, but the feelings still remain: guilt, blame, shame, anger, sadness. The accumulation of all of it. There's no wrong way to feel about it."
"I am going to tailor my approach with you differently than I would in other cases. I think this will help you overcome your programming and fight off the instinct when someone does try to override you. I would like to rope in some experts in AI and robotics, if you do not mind. I won't discuss with them what we talk about, but I'll admit my own shortcomings when it comes to those fields, and I would feel better if I had help behind me. If that's all right with you. If not, I can figure something else out."

Victoria considered it for a good, long while, running simulations and risk assessments.
"I have some conditions. First, I get to approve whoever is involved. Next, no one gets my complete specs. There is already one more ASTRA running around than there should ever have been. Then, no one makes any code modifications that I don't approve. And lastly, Hope van Dyne, my creator: if you feel it is absolutely necessary to involve her, I don't want to meet her, not even a chance encounter, unless the backdoor in my system is disabled. Otherwise… you're the expert. I'll follow your advice," she nodded.
Something the doctor said was bugging her a little.
Overcoming her programming. The thought felt like exactly what would lead her to become the thing she feared. Sure, the admin override needed to go, but some of the other tenets of her program felt like solid measures.
"In fact… if she could be convinced to replace the AI in my core with something less dangerous, or even just to remove it and let me build my own drivers…" she thought out loud,
"I'd take literally having to learn to walk again, if that's the price for that thing being gone."

"I can agree to those terms. I would like to speak to Hope on my own, if that is okay. I'll even ensure she's as far away from you as can be. Everything else should be fine, then. I'll have a list of candidates with their qualifications, and you can let me know who you think would work best. In the meantime, let's end things here. Please call for me if you need anything; one of the nurses should be able to reach me if need be. Otherwise, feel free to tour the place and meet the other people here if you want."

"I will, thank you. Do you think I could order a delivery of books here? There is a certain author I need to read," Victoria asked, wondering whether she could cram both the Robot series and the Foundation into two weeks.