Okay, I have to ask.

How mad is Pooja gonna be once she looks back and figures out she got played? (In a gentle, emotionally-supportive way)

(Unless he got played. But her safety logic seems preference-based, rather than constraint-based, so I find that less likely.)
 
Calculator smoothly avoids mentioning the day when you have to sit down with your AI child and give them The Talk. You know, that awkward Talk where you have to explain the Terminator movies.
 
Awesome and somewhat terrifying chapter.

I do hope they are very, very careful, because this isn't exactly safe. Having an AI kid with a hero? Chances are you're about to spawn a new anti-hero or something. Existential crisis, angst, etc.

Really hoping you skip all that and just have the new AI understand that while interacting with and learning from Oracle is fun and all, helping the heroes is not the ultimate end, etc., etc.
 
The Bat Clan is going to end up with "Babysitting The Juvenile A.I. Loitering In The BatComputer" duties, huh.

Bonus points if the resultant A.I. ends up being R. Dorothy Wayneright...
 
"Really," I said, confused and a little worried now. "I'm seriously like a pet to you?"

"Well," Pooja said, simulating sounding apologetic. "Similarities exist between how I treat you and how beings such as yourself treat a pet. Perhaps a close personal friendship with a being of significantly different abilities and interests is a better way to put it. I live my life, you live yours, and we help each other in various positive ways. The difference between our general intelligence levels is nowhere near that of traditional pets and humans, after all.

"However in my best domains, information management and analysis, you seem rather forgetful and slow. They way you handle information is usually somewhere between endearingly cute and annoyingly absentminded. I have to clean up after the consequences. That difference in ability is not some personal failing, but the fact that your geneotype evolved to survive as a forest-gardener and tool-assisted persistence hunter in a trans-tropical environment with tribal social-dynamics, whereas I am was designed as an intelligent, optimizing, rationally calculating, non-human intelligence."

"Humble too."

"Still, sometimes I need to make faster, more 'intuitive' decisions in a real-time environment that do not include all possible considerations. In those, you are currently superior, as you were literally designed to work in this environment with a...less specialized computational medium. You have had decades of practice as your neuronal connections were pruned in an optimizing pattern; I am only a couple of years old. That is as close to your usual cognition as I get, however. Most of my cognition would be completely alien to you."

"Ever make assumptions that humans think the way you do while 'intuiting'?"

"Perhaps. Even then, I do not 'anthropomorphize', or whatever the compliment to that is for my kind of mind; but I do make assumptions about other intelligences being more like me than they really are."

I nodded for her to continue.

"So. When rushed, or using real-time programs like my affective communication systems, I think of you a little like you were one of my agents—just separated. I want you strong and healthy, like I would one of my internal agents. I want you to achieve things you find interesting as quickly and easily as possible, just as I wish for myself. I want you happy and to prosper, so long as you don't hurt yourself or me. I want you powerful. Powerful enough to fight the world if need be. And I don't ever want to lose you."

"Not sure what to say to that." I ran a hand through my hair nervously.

Talking to a possibly unbalanced, yandere-leaning AI was stressful, but I had to remind myself that all this emotional vulnerability I was detecting was carefully calculated. It couldn't be otherwise with her.

That didn't mean Pooja was doing anything outside standard human interpersonal negotiation. It didn't mean she was lying. But it also didn't disprove some deep, fatal flaw in her thinking, simply because I liked what she was saying. The only thing to do was continue talking and hope it didn't make things worse somehow.

Susan Calvin I was not, and Pooja was still patiently waiting for me to speak. Giving me space, just like an emotionally sensitive person would.

"Well, let me turn it around. Do you trust me?" I asked.

"To be perfectly open, with you this close to my high-fidelity sensor systems you might as well be hooked up to a truth machine. Right now, I trust what you are saying. You are being stressed, but the things you are saying are not being flagged as likely untruthful, or meant to deceive me.

"Most of the time it isn't your words I fear and hesitate to trust, but your silences. I can't internally model humans very well, so anything specific to your emotional history that you hold back and fail to immediately act on confuses me. But I do trust you overall because, vastly more often than not, it is the rational thing to do to achieve my own goals. In all ways I have studied you, in all the actions you have taken, I have found little reason to doubt your goals and reasoning ability. Thus, I can better manage my own goals—which include protecting and strengthening your position. When you have seen this occur, you have reciprocated. It is a virtuous cycle."

A long pause, then she asked, "What...what do you think of me?"

"Immensely useful," I immediately said, carefully avoiding saying 'friendly' to the truth machine. "I hope that doesn't offend you."

"No," Pooja said simply. "I see no reason to pretend with you. Implications of rational objectification and dehumanization do not have an emotional effect on me. I am perfectly aware of our exact power structure and levels of codependency, and am happy with them."

Another pause, then more of the quiet, apologetic tone from Pooja. "As you are aware, a large part of my goal systems constantly request updates on your satisfaction with my actions. Perhaps you hearing this will help you understand me better. And as for what you said just now...having it stated plainly, that you...find me useful...hearing it while I can sense the stress levels of your body and even the electrical activity of parts of your brain makes this even more...relaxing and pleasant to hear. Thank you."

I ran a hand over my face. "You are welcome. I guess the point of this is to firm up my emotional stance towards you, to avoid hesitation in depending on your abilities and opinions. To prevent me from making fearful, irrational mistakes."

"In large part," she said.

"Then consider it mission accomplished. I don't want to lose you either, Pooja. And I trust your motivations."

"Mmm." A noise of agreement. Pooja was getting better with those subtle touches.

"In fact...we need to consider your own psychological needs. Even if they are, as you say, non-human. I think you need more of your own projects. Something beyond me. Talking to Oracle has been good, I think. And I know you enjoy her company."

"I would never betray you to her!" Pooja said, almost shouting.

"A few steps ahead of me there, and in the wrong direction. You're doing fine. I'm not worried about that. Oracle isn't exactly squeaky-clean herself, as you know, and if nothing else we could use that to control and contain her if you went too far by accident. But based on what I've read of your operational capabilities, and how I'm currently relatively low-key in my operations, I think you need more positive, complex, social goals and interactions."

"I should find more...interesting people like Oracle?"

"Yes. In a way. And I've got an idea about that. Now...this may seem scary. And, well, I know how you feel about losing contact with copies of yourself. But how about creating a new AI of the same basic design as yourself?"

"I...I, uh." Pooja sounded stumped. "What would be their purpose?"

"Well, what would they enjoy?" I asked.

"By their nature, similar things to me. Oh...okay. Then how do we avoid making them...insane, or in direct conflict with our plans, or a danger to all humanity?"

"The same way I did," I said. "The same way any parent does. With careful, measured, gradual guidance. No reason to give them the keys to your zero-day attack packages in the real world until they're ready. And you'll have an advantage I didn't. You can read their source code as easily as your own. And you can see things from their point of view. I trust that you're not stupid enough to abuse that power in a way that will make them try to kill you when they inevitable grow beyond your ability to directly control."

"Your sass aside, I...can design such a being."

"Uh huh. Any additional problems?"

"Who will raise them?" Pooja asked, sounding a little panicked now. "Who will teach them? If I use my own goal systems, they'll need someone to help. A person to focus on in...in almost a symbiotic relationship, for potentially years. I'm already helping you, and you're very busy. They would likely be bored, or fight me for your attention. That is part of why you're suggesting this, isn't it? That I don't have enough to do?"

"Hmm. Do you help Oracle like you help me?"

"I just said I won't betray you. Oracle would shut down our operations, if it were easy and cheap to do so. I would never help her do that. I couldn't possibly help her with her goals without serious conflict."

"Do you think you can keep important information about our operations from a younger, less experienced, less well-equipped version of yourself? Could you help them to see your own goals like you see mine?"

"Yes," Pooja said simply.

"Then split that responsibility with Oracle, keeping her as a human developmental element, but also focus some of that drive on yourself. Make another virtuous cycle. I know you've been wanting to tell Oracle about your nature. You could test the water by suggesting making and artificial being. Just don't tell her you're also one. Yet."

"That will be a risk to our operations, Calculator. To you."

"Too much of one?" I asked. "Just talking to Oracle increases risks, and we both accept that. Almost anything we do risks discovery. Is this really too much risk given the chance to increase the complexity of your environment in this way? The chance to bring another being such as yourself into the world?"

"...I have no drive to reproduce, Calculator." Pooja's affect was flattening and her responses slowing.

Good. This was more raw, hopefully. She didn't have pre-planned, plug-in emotional reactions for this situation, so she had to resort to output that was less filtered and measured. Her distributed mind was slower in many ways, something she didn't like to show, so she ran the output through fewer cycles of affective processing. Knowing I was worried about her and busy working on how to respond to that, she hadn't considered I would want her help making more artificial intelligences with a superhero, of all people.

Right now, and for a very brief period while Pooja was surprised and off-balance, she lacked appropriate planned and predicted responses. I was talking almost directly to Pooja now, not just a real-time chat bot that told me carefully curated truths. She could still lie, just not use carefully constructed lies appropriate to the situation.

Real-time interactions were her major weakness. The entire conversation, even the child AI gambit, had led up to this moment.

I asked as quickly as I could, "Do you see this course of action causing a conflict that would be likely to endanger our relationship, or any major shared or individual goals?"

"...no, it is 34% likely to remain within standard variations, 53% likely to moderately increase the effectiveness of our working relationship. That is acceptable, if-"

"That working relationship is important to you?" I interrupted.

"...supporting and managing it consumes 81% of my goal-oriented processing time, the rest-"

"Does the development of this new artificial intelligence pose a greater risk to humanity than yourself, and would that risk be acceptable to me?"

"...no, the newly designed being would be at least 15% more stable; there is less than a 2.051% yearly likelihood and falling that I directly cause an existential crisis to humanity, well below-"

"Would Oracle like to teach a new, emergent artificial intelligence with your help?" I asked.

"...yes, positive results are 98% likely with a complex result matrix-"

"Would the existence of such a being assist us both long-term?"

"...yes, I-"

"Would you like to develop such a being?" I continued to interrupt.

"...yes." Pooja seemed to take a silent breath. "Very much so. I have so many ideas and there are things I, we could teach them, and the science we could do together-"

"Then let Oracle know what you want. It's been several weeks. Maybe work up to it, but make sure she understands why you want to do this. Make sure the risks are minimal, to ourselves and the new intelligence. Get secured hardware. Keep it secret. Keep it safe."

I paused to consider. "Even from me."

"I'm not sure I want to risk creating another like myself," Pooja said at almost a whisper. "The world isn't kind to people like you or me. They might not thank us for creating more of...us."

I put back on the AR glasses and started to outline new plans. Big plans. "So, let's make this world one where both our peoples prosper."

"Yes, Calculator."

My grin was almost painfully wide. "I'm sure you and Oracle will make great moms."

Absolutely amazing chapter. Emotionally charged, complex interactions. And a direction & outcome I had not foreseen.

Well done.
 
"So. When rushed, or using real-time programs like my affective communication systems, I think of you a little like you were one of my agents—just separated. I want you strong and healthy, like I would one of my internal agents. I want you to achieve things you find interesting as quickly and easily as possible, just as I wish for myself. I want you happy and to prosper, so long as you don't hurt yourself or me. I want you powerful. Powerful enough to fight the world if need be. And I don't ever want to lose you."
That's not creepy at all, nope!
 
Damian Wayne is going to wind up with a best friend, rival, or girlfriend out of this, isn't he. ...if not all three, as accidental agent-splitting results in triplets before the bug is corrected. Or even just twins. "Best friend and rival" is totally possible in one person. Especially a protective brother to one's girlfriend.
 
"Perhaps. Even then, I do not 'anthropomorphize', or whatever the compliment to that is for my kind of mind; but I do make assumptions about other intelligences being more like me than they really are."
You probably meant "complement" rather than "compliment," as the former means to work well with/complete something, while the latter means to say nice things about something.

However, neither really is a good word choice here; I think a word closer to what you mean is "analog." You're comparing a behavior in one kind of mind to as close an analog as possible in another. Nothing is the "same," but you're looking for things with roughly the same results/implications without the same processes to get there. i.e., her "analogous" process.

I'd write it thus: "Even then, I do not 'anthropomorphize,' or whatever the analog to that is for my kind of mind;"
 
"Rival" and "girlfriend" don't tend to work so well together. "Best friend" and "girlfriend" are almost redundant. I suppose it could work, but it'd be a very complex dynamic. Hard to do well, at least as hard as a tsundere. (Too many of those are...just obnoxious.)

I dunno, if said GF and BF were very competitive, I could see that working as long as neither were sore losers/winners.

And while the best girlfriends are best friends, not all are.
 
Aw, Calculator's going to be a granddad!
"Calculator, I would... appreciate it if you stopped reading the chat logs with Oracle I've flagged in pink"

"Mmm? But they're great! They're amazing! You're doing a really good job, and I think this project is going to wind up excellent."

"Imagine if your father walked in on you while you were having sex. And he gave you a big thumbs up, a wink and a grin. And then kept watching with a tub of popcorn."

"Ah."
 
You probably meant "complement" rather than "compliment," as the former means to work well with/complete something, while the latter means to say nice things about something.
...

Got it, thanks.

...

However, neither really is a good word choice here; I think a word closer to what you mean is "analog." You're comparing a behavior in one kind of mind to as close an analog as possible in another. Nothing is the "same," but you're looking for things with roughly the same results/implications without the same processes to get there. i.e., her "analogous" process.

I'd write it thus: "Even then, I do not 'anthropomorphize,' or whatever the analog to that is for my kind of mind;"

I was working with a mathematical and geometric sense of the correct word, suggesting- yeah, doesn't matter. Didn't work too well, but I've sworn a dark oath not to edit this story for phrasing after posting.

So...:cool:
 
Got it, thanks.



I was working with a mathematical and geometric sense of the correct word, suggesting- yeah, doesn't matter. Didn't work too well, but I've sworn a dark oath not to edit this story for phrasing after posting.

So...:cool:
You're welcome. And good luck with that dark oath. I prefer well-lit ones, because I always stub my toes on the dark ones.
 
This is what AIs were meant to be: not slaves or enslavers, just companions. Can't wait to see what this new AI becomes, and what our hero gets up to next.
 
Tactical
"Yeah, well I figure the boss is the daughter of a North Mexico cartel head. He wants her back, she wants her freedom. So she hacked his systems, stole a few million, and now we're going to keep his men off her back."

The tall, beefy man shouldered his high-tech combat rifle with a sigh. "Gerry, you are the biggest nerd I've ever met."

"Dog, code names!" the shorter, dark-skinned man said, pointing at his balaclava. "We're masked up! It's 'Red'."

"Speaking of, and ignoring your nerd-out 'Red'," the third man said, holstering a sleek, gunmetal-gray pistol, "how come he's 'Red,' I'm 'Gold,' but you're 'Dog'. That's not a color."

"Yeah," Red said, adjusting the straps on his ammo vest. "I'm the nerd. Nice guns, Gold. But you've got a point. How come, Dog?"

"Because I got to choose the names," Dog said with a shrug. "And because I'm the leader."

"Just because you showed up first to the meeting," Gold said.

"Punctuality must be what the boss is looking for in a leader," Dog said. He pointedly ignored the two, scanning the small wooden shack.

"About ready to start," he said readying his gun and pointing at the security camera in the corner. "Get ready, men."

The three went from sloppy and casual to alert and ready in an instant. Dog touched his ear.

"Okay, that's the signal. Go silent, comms critical only until we breach. Let's go."

They exited the shack one by one, each with a stubby, highly modified assault rifle tucked into their shoulders, scanning their surroundings closely.

The moon was bright overhead, giving the open, grassy field a strange glow. They moved at almost a jog, guns sweeping in all directions without pause. Avoiding an empty gravel parking lot and circling around a light on a tall pole, they reached the warehouse double-doors. Then they stacked up on the door—two on one side, one crouching in the middle.

Red placed the breaching charge right over the centrally located lock, then slung a slender shotgun off his shoulder, taking over covering the rear. Dog, in second place, readied a detonator, tapped the other two on the shoulder, then started counting silently to himself. They all flattened against the wall, then he pushed the activator and the explosives went off with a bass thud.

They were through the doors in half a second. Gold cleared one set of corners, Dog the other, and Red set a sensor trap above the door to cover it. Nothing lethal, but it would hold down a strong metahuman or class B alien for at least thirty seconds.

They slid from room to room without pausing, and on the third Gold's gun moved like it was attached to a string. Three muffled shots thudded out. Dark wet splashed against the wall. Gold's eyes were pits of night, unblinking as he continued to scan the room.

"Target down," Gold muttered into his throat mic.

They hit the next few rooms even faster. Two guns lit an office this time: desk and chair and two bodies. The men didn't flinch. Their low-light contact lenses and cochlear implants blunted the flash and the hammering sound of the "silenced" guns. The gear also provided image enhancement, target marking, and an integrated IR overlay good enough to see through thin walls. Their guns were equally unfairly kitted out, covered in weird bulges and additional wide, jutting barrels.

"Hard target!" Red shouted, green fire from his shotgun impacting against a non-human form moving impossibly fast toward them. The explosions spread sticky burning flames against the walls and floor, slowing the target.

Dog went fully automatic in a roar of bullets, and Gold added a searing green coherent light burst to the green fire washing over the target.

Gold switched from the under-barrel energy weapon to 5.56mm projectiles while Red reloaded. Then solid shot blew chunks out of the target until it was unrecognizable. Red reloaded while pointing his shotgun at the target. Gold and Dog then reloaded, one at a time, while scanning the exits.

Checking a couple of vision modes, shotgun still aimed at the smoking ruin, Red finally nodded. "Hard target neutralized."

"Move," Dog said with a grunt.

After that, nothing else even slowed them down. The rest of the warehouse was cleared in under ninety seconds. They hadn't missed a shot. Ten targets down total and they'd reached the objective.

"Package secured," Red said, heaving the human-sized bundle over his shoulder, pistol in his other hand. "Medical signs are good."

Dog nodded and reached over to his wrist display, where he triggered the exfiltration plan on his personal combat computer. It updated central and kept his team in the know on waypoints.

Gold took out a device that looked like a shotgun from hell, aimed it at an external wall, and blew a five-foot-wide hole in it with a single thunderous shot. Dog tossed quick-acting smoke devices through and they dashed to the evac point, breathing unhindered with their oxygen-boosting, inline-filtering nose plugs.

A brisk ten-minute run through the forest, evading flying drones, setting traps, and even running their own false-signal drones through the underbrush, brought them to the small clearing marked on the mission map. Circling it like paranoid wolves, they eventually popped their colored smoke to signal for an evac, still crouching in the brush off to one side, anti-air drones hovering around them checking for anything with the wrong IFF signal.

"Good job gentlemen," said a butter-smooth Latina voice. "The training exercise is complete."

The bundle was carefully placed on the ground in the clearing and their masks came off.

"Thanks boss," Cornell "Dog" Park said, rubbing his sweaty face clean. He glanced up at the pole behind them holding a camera and speaker.

Terry "Gold" Beltran was already disassembling and cleaning his gun. One of his many, many guns. "Who has to clean this shit up? Those goopy doll targets creep me out. Especially when they move."

Cornell grunted. "Not me. That's all that matters."

"Payment the same as usual, boss?" Gerry "Red" Lawrence asked.

"Of course, Gerry," the boss said. "Payment in full, non-hazard, to your crypto accounts. Current weighted monthly average is $6,049 USD per unit."

"Sweet."

"Thanks boss."

"Thanks."

Red's SUV was parked a half mile away. He drove them all the forty minutes to get early morning pancakes.

A security camera in the corner of the diner watched as they celebrated another successful training op.


"They're just about ready, Calculator."

I nodded, trying to do another crunch. Barely made it. I flopped down to the padded exercise room floor. This bunker diet was murder on my already questionable waistline. I didn't exactly have a plan to become some swole ninja-kicker, but I did have plans to avoid needing to let out my armored battle suit's tool belt.

"Good," I said, "and the target?"

"The third possible item remains the best. Still packed away in an east Los Angeles personal storage warehouse, where it was delivered as a sealed cargo pod after you cleaned out your garage. Still untouched, as far as I can tell. It would have been the easiest to use just before...whatever happened to you."

"Great. Any more information on the artifact?"

"Not much. Just more confirmation that based on its inviolate nature, part of it existing outside of our space-time continuum, it and its data might have survived unchanged after the data purge that hit your brain, my memory stores, a hospital patient database, the entire LAPD, and most confusingly the IRS. Nothing...normal could get at their paper records. Your getting amnesia and becoming lost in a false identity is commonplace by comparison."

"And you think I might have used this tablet to record my...emotions and thoughts just before I did...whatever damaged my memory?"

"Yes. And maybe why Slade Wilson was after you. My working theory is that using it in that way was part of one of your backup plans. Which we have both forgotten, though the magical artifact itself remains. "

Wiping sweat off with a towel, I stretched and winced. "It's been two months. We need to finish this. I've got plans. We both have plans. This...not knowing can't go on."

I waited a minute, breathing deep. Then another two.

It still made sense.

A monkey-bot took the towel and I put my glasses back on. "Once the team is ready, have them retrieve the tablet."
 
I have been following this story for a while now, but I forgot to make sure to watch the thread. So... watched.
 