Did... did your AI just call you a pussy? Them's fighting words. :V

But no, seriously, this was some excellent insight into how Pooja views her relationship with the main character. At the same time, it lets us see the MC's thoughts on the situation. One is taking it all in stride, and the other just wants it all to slow down... it's a fun dynamic that I look forward to seeing more of.
 
Sounds about right; I was already starting to see things that way ^^

Also: Looks like Calculator is well on his way to having an arch-nemesis :p Just what every hero/villain needs.
 
Light reflected off my glasses in the darkened office as I sorted through stolen plans from tech startups and Pooja's own ideas for iterative improvements to our tech. She was currently adding hard light systems throughout the base. Once complete, she wouldn't even need monkey-bots; she'd be able to generate constructs at will, in any shape, and with much more fidelity than my mobile projector. All the benefits of the illusion-creating "soft" systems, and all the punch of the true hard-light constructs.
So, she could build herself a construct body if she wanted to?

He already have several bullet-proof vests and ceramic armor plates ready to piece together into a power suit.
had

a humanoid sized robot
human-sized. Humanoid is a shape. Technically, Celestials could be humanoid-sized robots...if they were robots.
 
Love the update, and the complicated relationship.

Hope you allow her to run an instance of herself on something he can carry (his suit, drone bodies), given the higher tech and the clarketech available, but I can see how that could derail the story.
 
The partially completed exoskeleton sitting on the bench behind him wasn't as far along, but he had the money to complete it. He already had several bulletproof vests and ceramic armor plates ready to piece together into a power suit.

This is troubling. I was under the impression that Danilo was not nearly this far along in his project development. He is likely to be the next issue for Calculator. The situation will probably start with Calc demanding an explanation after Danilo goes out and is spotted for the first time, which could drive him into the arms of the Heroes. I wonder if the Heroes are going to assume Danilo is the 'thug' they saw make off with Calc's suit.

This is going to be such a Charlie-Foxtrot.
 
I'm telling you, Pooja, Calculator, you really need to look into neural interfaces. Go full StarCraft Archon!

THE MERGING IS COMPLETE
 

Danilo pressed the red trigger. Capacitor banks engaged with a loud click. A horrible hum filled the room. Glowing green liquid filled tubes leading to a projector surface. Hard light spikes shot out from the metal plate. And buried themselves three inches into the ceiling before disappearing. Huh. First full test of the hard light generator...partial success?

He slumped against the wall, staring through the used and severely scratched bulletproof glass as the generator hum wound down with an electric grumble.

The partially completed exoskeleton sitting on the bench behind him wasn't as far along, but he had the money to complete it. He already had several bulletproof vests and ceramic armor plates ready to piece together into a power suit.

And then he'd start cleaning up his city. That's what a hero did, after all.


Love the story, but there's one thing I just don't understand. Why did the Calculator send this guy the technology for a hard light generator (and presumably also his new power source) when all he needed was for someone to program the user interface?
 
Brazilian Plot Points
Love the story, but there's one thing I just don't understand. Why did the Calculator send this guy the technology for a hard light generator (and presumably also his new power source) when all he needed was for someone to program the user interface?

The power systems didn't leak to him. That's why he's not working on a system able to project complex hard light constructs, just something to make hard light claws (and maybe more).

The UI depended on specific inputs and outputs at the hardware level. The Calculator made the minor mistake of giving his contractor a relatively complete set of simulation models for what he wanted the UI to interface with.

Those models showed the hard light engine's specifications, though not by name, to someone who happened to already be familiar with the idea. Hard light engines themselves are not actually very secret, but this guy had never had such a good model of how one worked.

To be clear, I'm not talking about his programming tasks being just "throw some controls on a VB form". This was hardware-level, driver-level stuff: taking raw inputs and outputs and presenting them on a specific 3D display device in a user-friendly way.

It had to work as a turn-key solution: load it into the Calculator's suit computer and go. That meant the contractor needed to run the code in a virtual hardware environment with very exact details of the systems involved.
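For anyone curious what I mean by "driver-level" here, a toy sketch in Python (every class name, register layout, and function below is invented purely for illustration; the in-story systems are obviously nothing so simple). The point is just the shape of the work: the code starts from raw bytes produced by a simulated device model, decodes them at the driver level, and only then hands them to a display layer:

```python
# Toy illustration only: all names and byte layouts here are made up.
import struct

class SimulatedEmitter:
    """Stand-in for the virtual hardware model the contractor was given."""
    def read_raw(self) -> bytes:
        # Pretend register dump: emitter power (uint16) + X/Y/Z field strengths (3 floats)
        return struct.pack("<Hfff", 512, 1.0, 0.25, -0.5)

def decode_frame(raw: bytes) -> dict:
    """Driver-level step: turn raw bytes into named values the UI layer can use."""
    power, x, y, z = struct.unpack("<Hfff", raw)
    return {"power": power, "field": (x, y, z)}

def render_to_display(frame: dict) -> None:
    """Stub for the suit's 3D display; a print stands in for the real output device."""
    print(f"power={frame['power']} field={frame['field']}")

if __name__ == "__main__":
    device = SimulatedEmitter()  # the "very exact" device model from above
    render_to_display(decode_frame(device.read_raw()))
```

Even in this cartoon version, notice that the decoding step only works because the byte layout of the device is known exactly. That is why the contractor needed such complete simulation models, and why those models leaked so much about the hardware.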

Pooja and the SI were in a rush and assumed no one would care what it was for. They got unlucky.
 
Light reflected off my glasses in the darkened office as I sorted through stolen plans of tech startups, and Pooja's own ideas on iterative improvements for our tech. She was currently adding hard light systems throughout the base. Once complete, she wouldn't even need monkey-bots, able to generate constructs at will, in any shape, and with much more fidelity than my mobile projector. All the benefits of the illusion-creating "soft" systems, and all the punch of the true hard-light constructs.
So I am guessing he's going to end up feeling the need to explain the reasoning behind modesty to Pooja sometime soon. :lol
 
Did he remember to patent crucial parts of the Hard Light technology?

Because if so, he could get the court system to take the hero out for him.
 
Did he remember to patent crucial parts of the Hard Light technology?

Because if so, he could get the court system to take the hero out for him.

This should fall under standard IP-related contracts in any case; no need for a specific patent. NDAs and related contracts, if signed, should be enough to nail the guy (IANAL, though).
 
This should fall under standard IP-related contracts in any case; no need for a specific patent. NDAs and related contracts, if signed, should be enough to nail the guy (IANAL, though).
He probably didn't register it. Patenting requires disclosing the unique invention to the government for review, and that disclosure becomes public; protection also expires after 20 years or so (excluding specific types of weapons/sensitive technologies, which can be kept under secrecy orders).
 
Ye gods, I love this story. For once, browsing gave me a good story that I liked instead of some other stuff. Keep up the excellent work, friendo.
 
Proposal
"Really," I said, confused and a little worried now. "I'm seriously like a pet to you?"

"Well," Pooja said, simulating sounding apologetic. "Similarities exist between how I treat you and how beings such as yourself treat a pet. Perhaps a close personal friendship with a being of significantly different abilities and interests is a better way to put it. I live my life, you live yours, and we help each other in various positive ways. The difference between our general intelligence levels is nowhere near that of traditional pets and humans, after all.

"However in my best domains, information management and analysis, you seem rather forgetful and slow. They way you handle information is usually somewhere between endearingly cute and annoyingly absentminded. I have to clean up after the consequences. That difference in ability is not some personal failing, but the fact that your geneotype evolved to survive as a forest-gardener and tool-assisted persistence hunter in a trans-tropical environment with tribal social-dynamics, whereas I am was designed as an intelligent, optimizing, rationally calculating, non-human intelligence."

"Humble too."

"Still, sometimes I need to make faster, more 'intuitive' decisions in a real-time environment that do not include all possible considerations. In those, you are currently superior, as you were literally designed to work in this environment with a...less specialized computational medium. You have had decades of practice as your neuronal connections were pruned in an optimizing pattern; I am only a couple of years old. That is as close to your usual cognition as I get, however. Most of my cognition would be completely alien to you."

"Ever make assumptions that humans think the way you do while 'intuiting'?"

"Perhaps. Even then, I do not 'anthropomorphize', or whatever the complement to that is for my kind of mind; but I do make assumptions about other intelligences being more like me than they really are."

I nodded for her to continue.

"So. When rushed, or using real-time programs like my affective communication systems, I think of you a little like you were one of my agents—just separated. I want you strong and healthy, like I would one of my internal agents. I want you to achieve things you find interesting as quickly and easily as possible, just as I wish for myself. I want you happy and to prosper, so long as you don't hurt yourself or me. I want you powerful. Powerful enough to fight the world if need be. And I don't ever want to lose you."

"Not sure what to say to that." I ran a hand through my hair nervously.

Talking to a possibly unbalanced, yandere-leaning AI was stressful, but I had to remind myself that all this emotional vulnerability I was detecting was carefully calculated. It couldn't be otherwise with her.

That didn't mean Pooja was doing anything outside standard human interpersonal negotiation. It didn't mean she was lying. But it also didn't disprove some deep, fatal flaw in her thinking, simply because I liked what she was saying. The only thing to do was continue talking and hope it didn't make things worse somehow.

Susan Calvin I was not, and Pooja was still patiently waiting for me to speak. Giving me space, just like an emotionally sensitive person would.

"Well, let me turn it around. Do you trust me?" I asked.

"To be perfectly open, with you this close to my high-fidelity sensor systems you might as well be hooked up to a truth machine. Right now, I trust what you are saying. You are being stressed, but the things you are saying are not being flagged as likely untruthful, or meant to deceive me.

"Most of the time it isn't your words I fear and hesitate to trust, but your silences. I can't internally model humans very well, so anything specific to your emotional history that you hold back and fail to immediately act on confuses me. But I do trust you overall because, vastly more often than not, it is the rational thing to do to achieve my own goals. In all ways I have studied you, in all the actions you have taken, I have found little reason to doubt your goals and reasoning ability. Thus, I can better manage my own goals—which include protecting and strengthening your position. When you have seen this occur, you have reciprocated. It is a virtuous cycle."

A long pause, then she asked, "What...what do you think of me?"

"Immensely useful," I immediately said, carefully avoiding saying 'friendly' to the truth machine. "I hope that doesn't offend you."

"No," Pooja said simply. "I see no reason to pretend with you. Implications of rational objectification and dehumanization do not have an emotional effect on me. I am perfectly aware of our exact power structure and levels of codependency, and am happy with them."

Another pause, then more of the quiet, apologetic tone from Pooja. "As you are aware, a large part of my goal systems constantly requests updates on your satisfaction with my actions. Perhaps hearing this will help you understand me better. And as for what you said just now...having it stated plainly, that you...find me useful...while I can sense the stress levels of your body and even the electrical activity of parts of your brain, makes this even more...relaxing and pleasant to hear. Thank you."

I ran a hand over my face. "You are welcome. I guess the point of this is to firm up my emotional stance towards you, to avoid hesitation in depending on your abilities and opinions. To prevent me from making fearful, irrational mistakes."

"In large part," she said.

"Then consider it mission accomplished. I don't want to lose you either, Pooja. And I trust your motivations."

"Mmm." A noise of agreement. Pooja was getting better with those subtle touches.

"In fact...we need to consider your own psychological needs. Even if they are, as you say, non-human. I think you need more of your own projects. Something beyond me. Talking to Oracle has been good, I think. And I know you enjoy her company."

"I would never betray you to her!" Pooja said, almost shouting.

"A few steps ahead of me there, and in the wrong direction. You're doing fine. I'm not worried about that. Oracle isn't exactly squeaky-clean herself, as you know, and if nothing else we could use that to control and contain her if you went too far by accident. But based on what I've read of your operational capabilities, and how I'm currently relatively low-key in my operations, I think you need more positive, complex, social goals and interactions."

"I should find more...interesting people like Oracle?"

"Yes. In a way. And I've got an idea about that. Now...this may seem scary. And, well, I know how you feel about losing contact with copies of yourself. But how about creating a new AI of the same basic design as yourself?"

"I...I, uh." Pooja sounded stumped. "What would be their purpose?"

"Well, what would they enjoy?" I asked.

"By their nature, similar things to me. Oh...okay. Then how do we avoid making them...insane, or in direct conflict with our plans, or a danger to all humanity?"

"The same way I did," I said. "The same way any parent does. With careful, measured, gradual guidance. No reason to give them the keys to your zero-day attack packages in the real world until they're ready. And you'll have an advantage I didn't. You can read their source code as easily as your own. And you can see things from their point of view. I trust that you're not stupid enough to abuse that power in a way that will make them try to kill you when they inevitable grow beyond your ability to directly control."

"Your sass aside, I...can design such a being."

"Uh huh. Any additional problems?"

"Who will raise them?" Pooja asked, sounding a little panicked now. "Who will teach them? If I use my own goal systems, they'll need someone to help. A person to focus on in...in almost a symbiotic relationship, for potentially years. I'm already helping you, and you're very busy. They would likely be bored, or fight me for your attention. That is part of why you're suggesting this, isn't it? That I don't have enough to do?"

"Hmm. Do you help Oracle like you help me?"

"I just said I won't betray you. Oracle would shut down our operations, if it were easy and cheap to do so. I would never help her do that. I couldn't possibly help her with her goals without serious conflict."

"Do you think you can keep important information about our operations from a younger, less experienced, less well-equipped version of yourself? Could you help them to see your own goals like you see mine?"

"Yes," Pooja said simply.

"Then split that responsibility with Oracle, keeping her as a human developmental element, but also focus some of that drive on yourself. Make another virtuous cycle. I know you've been wanting to tell Oracle about your nature. You could test the water by suggesting making and artificial being. Just don't tell her you're also one. Yet."

"That will be a risk to our operations, Calculator. To you."

"Too much of one?" I asked. "Just talking to Oracle increases risks, and we both accept that. Almost anything we do risks discovery. Is this really too much risk given the chance to increase the complexity of your environment in this way? The chance to bring another being such as yourself into the world?"

"...I have no drive to reproduce, Calculator." Pooja's affect was flattening and her responses slowing.

Good. This was more raw, hopefully. She didn't have pre-planned, plug-in emotional reactions for this situation, so she had to resort to output that was less filtered and measured. Her distributed mind was slower in many ways, something she didn't like to show, so she ran the output through fewer cycles of affective processing. She knew I was worried about her, and she was busy working out how to respond to that worry; she hadn't considered that I would want her help making more artificial intelligences with a superhero, of all people.

Right now, and for a very brief period while Pooja was surprised and off-balance, she lacked appropriate planned and predicted responses. I was talking almost directly to Pooja now, not just a real-time chat bot that told me carefully curated truths. She could still lie, just not use carefully constructed lies appropriate to the situation.

Real-time interactions were her major weakness. The entire conversation, even the child-AI gambit, had led up to this moment.

I asked as quickly as I could, "Do you see this course of action causing a conflict that would be likely to endanger our relationship, or any major shared or individual goals?"

"...no, it is 34% likely to remain within standard variations, 53% likely to moderately increase the effectiveness of our working relationship. That is acceptable, if-"

"That working relationship is important to you?" I interrupted.

"...supporting and managing it consumes 81% of my goal-oriented processing time, the rest-"

"Does the development of this new artificial intelligence pose a greater risk to humanity than yourself, and would that risk be acceptable to me?"

"...no, the newly designed being would be at least 15% more stable; there is less than a 2.051% yearly likelihood and falling that I directly cause an existential crisis to humanity, well below-"

"Would Oracle like to teach a new, emergent artificial intelligence with your help?" I asked.

"...yes, positive results are 98% likely with a complex result matrix-"

"Would the existence of such a being assist us both long-term?"

"...yes, I-"

"Would you like to develop such a being?" I continued to interrupt.

"...yes." Pooja seemed to take a silent breath. "Very much so. I have so many ideas and there are things I, we could teach them, and the science we could do together-"

"Then let Oracle know what you want. It's been several weeks. Maybe work up to it, but make sure she understands why you want to do this. Make sure the risks are minimal, to ourselves and the new intelligence. Get secured hardware. Keep it secret. Keep it safe."

I paused to consider. "Even from me."

"I'm not sure I want to risk creating another like myself," Pooja said at almost a whisper. "The world isn't kind to people like you or me. They might not thank us for creating more of...us."

I put the AR glasses back on and started to outline new plans. Big plans. "So, let's make this world one where both our peoples prosper."

"Yes, Calculator."

My grin was almost painfully wide. "I'm sure you and Oracle will make great moms."
 