A cyborg in the Wasteland [Fallout] [Self-insert]

That's "curing autism" talk. It can only be done by people who completely fail to understand the topic.

Many of the strongest benefits of my condition come from the absence of something you have trouble conceiving of living without. Being given that absent thing would then rob me of what I have attained due to its absence.

I don't want it, and if you tried to force it on me I'd genuinely be willing to kill to prevent that from happening. You'd be murdering everything that makes me me, after all.

Other aspects come from my being differently cabled in the brain. Not so much missing something as having things wired up in a way that produces non-standard function. Adding more processing power won't ever make those functions standard, just more performant.
We're literally talking about removing the downsides while keeping the benefits of autism.

I'm asserting my belief that being able to read people is not a core part of my or anyone's personality. I think I'd still be me regardless of whether my ability to understand other people was removed, or even if I suddenly gained the ability to understand the emotional cues of all other animal species. With this as a base, I'm assuming it's hypothetically possible to give someone with autism the ability to read people without removing any other abilities autism confers.

I'm essentially positing that the brain isn't a zero sum game where in order to gain something you have to lose something else.

I believe it's a distinction between mind and consciousness.

I think the best analogy is if your life is the equivalent of a story that you are reading to yourself, and that is your stream of consciousness. The pages at the start (your childhood) may be very different in pacing than the later chapters. They may even involve completely different characters from those at the start. But there is an overarching narrative - events connect to each other.

Forking and augmentation are best thought of as novel narrative techniques - perhaps a scene is described from several points of view, perhaps there are 'inhuman' points of view, poetry, illustrations, different characters take over the narrative. All of these can be easily done while following the same story from start to end. (Or indefinitely).

However, some things could very well equate to ending the story, for good, and starting a new one with a new narrative - wherein the events of the previous story do not inform the current story in any real way and essentially the 'author' of the story has changed. To many in Eclipse Phase, this is the equivalent of creating a designer fork of yourself that is not sufficiently informed by your prior consciousness. It's not a fork, it's a child. That's not necessarily terrible, but it's not you.
Yeah I agree that an altered fork of you wouldn't be the same person, but that's mostly because unlike Meimei I don't consider forks to be the same person as me to begin with. I'd consider myself to be the same person if someone else edited my mind as long as my memories were the same and the parts edited weren't my core drives or goals.

Becoming a hyperintelligent god probably would screw her over, frankly. It's a fantasy pipe dream and I doubt she would have ever been able to actualize it. A hyperintelligent mind would have less in common with her than she has with a dog - it could only be her in the loosest philosophical sense.

The issue with 'fixing' her, though, isn't whether it's theoretically possible, but whether it's safe to do so with the tools actually available to her.

Only the TITANs truly understand human minds. I can scan you while you're socializing and maybe loosely identify a part of your brain you use for that. But that's it - there's a limit to how much we can narrow it down.

And then there's the problem 'neurodivergent' alludes to - a brain that's different. It's funny that SpiraSpira mentioned Daredevil, because it's a known thing that blind people's visual cortex can be repurposed - literally, the neural tissue ends up performing other jobs. For the sake of argument, suppose we isolated your 'superior' social-meats - where do we put them, if MeiMei has that region of the brain doing something else that - by virtue of being neurodivergent and unusual - we maybe can't properly identify? There aren't labeled I/O ports within the brain, you know!

There are no strict rules for how things are laid out. If there were, the apprentice wouldn't have needed to calibrate her neurocomputer to her sensorium despite Lily already doing so with her own.

But even if everything goes right, even if you can perfectly identify your social powers and perfectly insert them into MeiMei - this is not guaranteed to be helpful. I stress once more, she made the decisions that let her 'win' at life overall by other means. If she entrusts her fate to your social sense, she is changing that formula. Conversely, if she does not trust it, the entire operation was pointless.

And remember, we are not comparing this against doing without. Anything this direct modification could 'say', an external system could tell her in real time. As such, this is a completely unnecessary psychosurgery with essentially no benefit.
I agree that this might not be possible with Eclipse Phase tech, if only the hyperintelligent TITANs are canonically capable of doing it.

As for where to plug in a social implant, who knows? But it can't be that much of a problem, since she managed the presumably much more difficult task of connecting her entire brain to a computer one node at a time, and then mapping out what all those parts do with an expert system in order to properly pipe the data from her implants to her brain.

If there isn't enough space or memory in her brain to support an extra implant, there certainly will be eventually, when she finishes digitising her consciousness. Although I definitely wouldn't choose my social abilities if I was picking a set of potential upgrades for someone.

I think the risks to her success would be low, since if anything her lack of social prowess hindered Meimei's development. She had to build an entire space station just to get people to make deals with her, to give just one example. Besides, if by some coincidence the implant doesn't help her, or hinders her, she could just disconnect it and go back to how she was before.

The benefit is that one of Meimei and Lily's stated goals was to understand everything, which is a bit hard to claim if you can't even understand the reasons why other people are making decisions. Lily at times seems confused not only by what people's emotional reactions mean but by why people would have those reactions in the first place. If nothing else it would at least help with haggling! Or with hiring people for her PMC, which she currently relies on Gary's emotion sense for - presumably something that can also be replicated.
 
We're literally talking about removing the downsides while keeping the benefits of autism.

I'm asserting my belief that being able to read people is not a core part of my or anyone's personality. I think I'd still be me regardless of whether my ability to understand other people was removed, or even if I suddenly gained the ability to understand the emotional cues of all other animal species. With this as a base, I'm assuming it's hypothetically possible to give someone with autism the ability to read people without removing any other abilities autism confers.

I'm essentially positing that the brain isn't a zero sum game where in order to gain something you have to lose something else.
I get what you mean, but one of the things this conversation edges toward is the problem of defining when something is merely different and when it is pathological.
It's a sore point IRL, sadly, and with many disabilities, not just autism.

There's also the question of why it's a downside, or whether it's just modern society with its rigid structures creating that problem.
On the other hand, there are some things that are legitimately mostly a hindrance.
Like ADHD, as an example.
There are some upsides to it that are pretty good from a survival-pressure point of view, but they are only very situational, with it being a big downside most of the time: mostly being the most aware and capable in extremely high-stress situations and not freezing up in those situations.
As downsides: bad to no short-term memory, anxiety, depression, lowered reaction speed outside of those high-stress situations, etc.
 
I get what you mean, but one of the things this conversation edges toward is the problem of defining when something is merely different and when it is pathological.
It's a sore point IRL, sadly, and with many disabilities, not just autism.

There's also the question of why it's a downside, or whether it's just modern society with its rigid structures creating that problem.
On the other hand, there are some things that are legitimately mostly a hindrance.
Like ADHD, as an example.
There are some upsides to it that are pretty good from a survival-pressure point of view, but they are only very situational, with it being a big downside most of the time: mostly being the most aware and capable in extremely high-stress situations and not freezing up in those situations.
As downsides: bad to no short-term memory, anxiety, depression, lowered reaction speed outside of those high-stress situations, etc.
Yeah, I understand it can be a bit of a sensitive topic. I'm only arguing from the point of view of a setting where we had access to handwavium that let us do essentially whatever we wanted in terms of psychosurgery.

If I had access to that sort of thing I'd get rid of most parts of my ADHD, no questions asked.
 
As for where to plug in a social implant, who knows? But it can't be that much of a problem, since she managed the presumably much more difficult task of connecting her entire brain to a computer one node at a time, and then mapping out what all those parts do with an expert system in order to properly pipe the data from her implants to her brain.
[...]
I think the risks to her success would be low, since if anything her lack of social prowess hindered Meimei's development.
Strictly speaking, the training of her neurocomputer is today's software tech, it's the implantation that's the quibble.

Her implant alters her inputs and outputs. Add things to senses, record senses. It doesn't directly mess with cognition. And "making her understand people" in a way a text popup can't, that'd be messing with cognition.

Her social prowess did not hinder her. Or rather, it only majorly 'hindered' her in that she recognized what people wanted, yet didn't want to give it to them. She needed her own station because she wasn't willing to be weaker than the state, not because she didn't recognize that they wanted her to be, well, weaker than the state.

The only way "improved social prowess" changes that is if it straight up changes what she values. Since what is good is defined by your values, changing your values is nearly always bad by definition.
 
Strictly speaking, the training of her neurocomputer is today's software tech, it's the implantation that's the quibble.

Her implant alters her inputs and outputs. Add things to senses, record senses. It doesn't directly mess with cognition. And "making her understand people" in a way a text popup can't, that'd be messing with cognition.

Her social prowess did not hinder her. Or rather, it only majorly 'hindered' her in that she recognized what people wanted, yet didn't want to give it to them. She needed her own station because she wasn't willing to be weaker than the state, not because she didn't recognize that they wanted her to be, well, weaker than the state.

The only way "improved social prowess" changes that is if it straight up changes what she values. Since what is good is defined by your values, changing your values is nearly always bad by definition.
Her medichines are programmed to work on brains, and seem to work fine on regular brains as well as neurodivergent ones, so they're clearly not one-size-fits-all and must have some way of knowing what's a tumor and what's a novel part of the brain. Presumably she could use the same scanning capabilities to work out where an implant should go.

I'm arguing that there is essentially no difference between adding to your senses and messing with your cognition. I don't see the distinction between adding extra images to your vision and adding cues to your sight that you can understand and pair with people's emotional states. In the same way that reading people's minds would be helpful and provide a lot of information but wouldn't change my values, being able to read people's expressions and understand their emotions should be helpful and informative but not central to her personality.

Meimei was known to everyone as either insane or a terrorist and didn't seem to understand why people considered her that way? She also didn't seem to understand that some people considered their biosleeves to be them and that killing them would be analogous to murder. She may not have cared even if she knew but knowing things usually provides you with more options.

Given that Lily wants to be a warlord and start her own PMC I'm dubious that being able to tell how and why people are emoting would be unhelpful.
 
We're literally talking about removing the downsides while keeping the benefits of autism.
Exactly. That's not how autism works. I've said this multiple times now. You're working from the same mindset that implies one can "cure" autism.

You can't remove the downsides without removing the benefits. In many cases the benefits only exist because the drawbacks exist. No drawbacks, no benefits. Full stop.

You can't both see the music of the spheres and not see the music of the spheres.

I'm essentially positing that the brain isn't a zero sum game where in order to gain something you have to lose something else.

The brain isn't a zero sum game, sure, but you can't both have and not have specific functions. You can't both have and not have non-standard behavior in the same functions.

That's the entire point here. You can't both have and not have the same specific cognitive events. It just doesn't and can't work. This is binary.
 
I'm hoping Lily gets the cogwave jammer and Professor Calvert's research so she can get a better understanding of telepathy and create a telepath-blocking implant.
 
Exactly. That's not how autism works. I've said this multiple times now. You're working from the same mindset that implies one can "cure" autism.

You can't remove the downsides without removing the benefits. In many cases the benefits only exist because the drawbacks exist. No drawbacks, no benefits. Full stop.

You can't both see the music of the spheres and not see the music of the spheres.



The brain isn't a zero sum game, sure, but you can't both have and not have specific functions. You can't both have and not have non-standard behavior in the same functions.

That's the entire point here. You can't both have and not have the same specific cognitive events. It just doesn't and can't work. This is binary.
How can you possibly know that? I agree it's impossible with current medical science, but we don't currently have a full understanding of how the brain works. You can tell me you don't think it's likely that the benefits of autism can be separated from the downsides, but unless you've recently completed some groundbreaking novel research on the human brain you can't tell me it's completely impossible.

Why can't you have both? Add a brain toggle or something if, for whatever reason, the two functions are completely incompatible. Switch on the social implant for talking to people, switch on the autism toggle for everything else.
 
Her medichines are programmed to work on brains, and seem to work fine on regular brains as well as neurodivergent ones, so they're clearly not one-size-fits-all and must have some way of knowing what's a tumor and what's a novel part of the brain. Presumably she could use the same scanning capabilities to work out where an implant should go.

I'm arguing that there is essentially no difference between adding to your senses and messing with your cognition. I don't see the distinction between adding extra images to your vision and adding cues to your sight that you can understand and pair with people's emotional states. In the same way that reading people's minds would be helpful and provide a lot of information but wouldn't change my values, being able to read people's expressions and understand their emotions should be helpful and informative but not central to her personality.
I mean, she has programs and muses to help her weigh that stuff already?

She just doesn't VALUE those estimations in the same way and doesn't WANT to.

Imagine you had a little sensor that fed you information on whether each decision you made was politically liberal or conservative, complete with which voter base you might upset or rally support from based on that decision. If you were a politician, this might be a very valuable piece of software. Even if you weren't, it MIGHT be worth paying attention to from time to time.

But unless you want to BE a politician, you aren't going to base decisions on it. If someone were then to plop you into a legislative seat and ask you to vote on various bills and base your decisions on the political sense you'd now gained... Well, you wouldn't be happy, and even if you knew the political ramifications of your decisions because of the software... Would you care? Or WANT to care?

Lily's endgame was to be so hyperintelligent that she could essentially consider all the various values for all her decisions without compromising her value judgement - essentially she isn't making decisions 'like a politician' or 'like an emotionally compromised moron', because her understanding of everything encompasses emotion, politics, etc. as pieces of a whole. Politics and psychology aren't really politics and psychology; they're basically math. She doesn't want to be a politician, she wants to be a god. She doesn't want to be normal, she wants to be able to account for normal stuff without actually thinking about it or valuing it like a normie.
 
How can you possibly know that? I agree it's impossible with current medical science, but we don't currently have a full understanding of how the brain works. You can tell me you don't think it's likely that the benefits of autism can be separated from the downsides, but unless you've recently completed some groundbreaking novel research on the human brain you can't tell me it's completely impossible.

Why can't you have both? Add a brain toggle or something if, for whatever reason, the two functions are completely incompatible. Switch on the social implant for talking to people, switch on the autism toggle for everything else.

You either have autistic cognition or you don't. You either have systemic perception or you don't.

There is no amount of extra hardware that can change this fact. It is a logical binary. There's no middle.

The benefits and drawbacks are the same functions, separated only by context. They either exist or they don't.

Full stop. Do not pass go. Do not collect any amount of dollars.

I've lived with this condition my entire life. I've kept pace with the research and the psychology of it. I am profoundly more familiar with this than you are.

Take. My. Fucking. Word. On. This. Already.

You're not only wrong, you're violently wrong -- in the sense that following your reasoning produces real-world oppression and violence -- and it's only my decades of patience that's preventing me from taking this as a personal insult on behalf of all those who share my condition.

Get a clue already.
 
I mean, she has programs and muses to help her weigh that stuff already?

She just doesn't VALUE those estimations in the same way and doesn't WANT to.

Imagine you had a little sensor that fed you information on whether each decision you made was politically liberal or conservative, complete with which voter base you might upset or rally support from based on that decision. If you were a politician, this might be a very valuable piece of software. Even if you weren't, it MIGHT be worth paying attention to from time to time.

But unless you want to BE a politician, you aren't going to base decisions on it. If someone were then to plop you into a legislative seat and ask you to vote on various bills and base your decisions on the political sense you'd now gained... Well, you wouldn't be happy, and even if you knew the political ramifications of your decisions because of the software... Would you care? Or WANT to care?

Lily's endgame was to be so hyperintelligent that she could essentially consider all the various values for all her decisions without compromising her value judgement - essentially she isn't making decisions 'like a politician' or 'like an emotionally compromised moron', because her understanding of everything encompasses emotion, politics, etc. as pieces of a whole. Politics and psychology aren't really politics and psychology; they're basically math. She doesn't want to be a politician, she wants to be a god. She doesn't want to be normal, she wants to be able to account for normal stuff without actually thinking about it like a normie.
We know that Lily does want this ability though? She turned her muse off in order to train herself to read cues. Meimei probably wouldn't have cared at all but Lily definitely does.

I mean Alice was ranked only just above the point where Lily would die for her a while back, and considering Lily wants to live forever that's pretty damn important. So it seems like it would be beneficial if she could understand Alice better since she obviously cares if Alice is happy or not.

Even if she intended to almost never act on the information, knowing things is better than being ignorant. I mean I'd definitely take that politician sense even though I have no intention of being a politician.

I don't want her to change her emotional range, I just want her to be able to understand other people's emotional ranges. These are two different things that people keep conflating.

You either have autistic cognition or you don't. You either have systemic perception or you don't.

There is no amount of extra hardware that can change this fact. It is a logical binary. There's no middle.

The benefits and drawbacks are the same functions, separated only by context. They either exist or they don't.

Full stop. Do not pass go. Do not collect any amount of dollars.

I've lived with this condition my entire life. I've kept pace with the research and the psychology of it. I am profoundly more familiar with this than you are.

Take. My. Fucking. Word. On. This. Already.

You're not only wrong, you're violently wrong -- in the sense that following your reasoning produces real-world oppression and violence -- and it's only my decades of patience that's preventing me from taking this as a personal insult on behalf of all those who share my condition.

Get a clue already.
Clearly you have a great deal of emotional investment in this topic and I'm sorry if my opinions are offensive to you.

I agree with you that in nature these two things aren't found together. I disagree that this means it's impossible for them to do so. I'm not advocating for anyone to go out and start performing brain surgery. I'm arguing that if we had perfect knowledge of the brain and arbitrarily advanced surgical and cybernetic techniques that it would be possible to have both at the same time.

If you have evidence that this is impossible then link it or PM it to me and I'll capitulate.
 
I mean sure, but she still has a perfect recording of every interaction she had with anyone for 300 years, including their expressions and her muse's diagnosis of their emotional state to compare. If she wanted to teach herself to read people, or even train a new expert system to point out what she should be looking for to make learning faster, she should have plenty of data to do so.

I don't know if she has a 'perfect' recording. She has memories, and they're not complete, nor can they be fully trusted. And again, it also limits her growth as a person to continue to be reliant on such a system, and it has been mentioned a few times so far that she wants to be more than she was in her last lives. By the way, she is also using the data to learn; otherwise she'd still be a murder hobo... Err... More of a murder hobo. It simply takes time to internalize some things and adjust. And I'll say it again, Lily HAS grown as a person. But like many geniuses, she will simply never be able to 'relate' on the same level as a 'normal' person might (I put quote marks around normal because I laugh when anyone uses that word to describe people in general).

In the end, at some point you'll just have to accept that some people, Lily in this case, are simply unable to relate to people the way you seem to think everyone should be able to. Brains are unbelievably, stupidly complex organs, and all it takes is ONE neuron firing backwards to develop some new kind of mental disability. As such, there may never be a simple or complex 'fix' for it; you've just got to use what you've got and roll with it.
 
You either have autistic cognition or you don't. You either have systemic perception or you don't.
The benefits and drawbacks are the same functions, separated only by context. They either exist or they don't.
Speaking as another with a similar neurodivergence, if I could add a chip to my brain that would crunch visual data on human faces and output calculated meaning to my subconscious, I would. To be fair, I would also do that for animals, trajectories, and non-natural knowledge as well, especially if it was toggleable. I don't think adding new senses would make me non-autistic, even if they are ones that let me seemingly patch my abilities with social cues, and I wouldn't suddenly become a more social animal either. I understand your view on the matter, and I wouldn't accept an autism 'cure' for any reason, but I would still jump fully on the transhumanist train of increasing myself socially as well as in every other manner.
 
Her medichines are programmed to work on brains, and seem to work fine on regular brains as well as neurodivergent ones, so they're clearly not one-size-fits-all and must have some way of knowing what's a tumor and what's a novel part of the brain. Presumably she could use the same scanning capabilities to work out where an implant should go.
Come on, dude. Tumors are cancer. Recognizing cancer is not equivalent to identifying distinct brain functions. And even if it were, we can identify tumors from scans today, but you don't see anyone proposing brain tissue transplants (or brain surgery in general) to tackle brain disorders, bar the most life-threatening and blatantly fucky ones like epilepsy. It's not my job to do your homework, and I increasingly want to reply with just "that's not how brains work" to your arguments.

I'm not telling you that in an ideal future some of this couldn't be done. But Eclipse Phase is not that future.

Meimei was known to everyone as either insane or a terrorist and didn't seem to understand why people considered her that way? She also didn't seem to understand that some people considered their biosleeves to be them and that killing them would be analogous to murder. She may not have cared even if she knew but knowing things usually provides you with more options.
I have no doubt that people explained in great detail to her why they thought these things. She "didn't understand" because she didn't agree with them.

Checking the text, the bit about the Jovians doesn't even have her complaining that they think they're the meat, but that they think she isn't a person due to lacking meat herself. And she can't be a terrorist because she's a machine. :o

Meanwhile, she's considered insane for things like planning to live forever.

I'd have to reread to be dead certain, but a cursory skim matches my impression - that she's only ever seen as "crazy" because she disagrees with people. And no amount of social-fu is going to change that. It could theoretically allow her to accommodate them better - except if she wanted to do that, she could have already done it with the unholy power of delegation to someone with said skill. I reiterate, she is not poor.

Frankly, even if your perfect brain hack existed, the fact that she chose not to use other means is indicative that she would not choose to use the brain hack either; put simply, she doesn't value it.
 
We know that Lily does want this ability though? She turned her muse off in order to train herself to read cues. Meimei probably wouldn't have cared at all but Lily definitely does.

I mean Alice was ranked only just above the point where Lily would die for her a while back, and considering Lily wants to live forever that's pretty damn important. So it seems like it would be beneficial if she could understand Alice better since she obviously cares if Alice is happy or not.

Even if she intended to almost never act on the information, knowing things is better than being ignorant. I mean I'd definitely take that politician sense even though I have no intention of being a politician.

I don't want her to change her emotional range, I just want her to be able to understand other people's emotional ranges. These are two different things that people keep conflating.


Clearly you have a great deal of emotional investment in this topic and I'm sorry if my opinions are offensive to you.

I agree with you that in nature these two things aren't found together. I disagree that this means it's impossible for them to do so. I'm not advocating for anyone to go out and start performing brain surgery. I'm arguing that if we had perfect knowledge of the brain and arbitrarily advanced surgical and cybernetic techniques that it would be possible to have both at the same time.

If you have evidence that this is impossible then link it or PM it to me and I'll capitulate.


Dude, I'm not anywhere on the spectrum (that I know of) and you're badly annoying me too.

Why is it even relevant to this story?

You have had it pointed out multiple times that NO, EP does not have the perfect understanding to mess with brains like this, thus it is not happening in this story.

Why do you insist on continuing to expound on how great it would be to magically 'cure' someone of thinking differently to you through mechanisms Lily doesn't have?

Speaking as another with a similar neurodivergence, if I could add a chip to my brain that would crunch visual data on human faces and output calculated meaning to my subconscious, I would. To be fair, I would also do that for animals, trajectories, and non-natural knowledge as well, especially if it was toggleable. I don't think adding new senses would make me non-autistic, even if they are ones that let me seemingly patch my abilities with social cues, and I wouldn't suddenly become a more social animal either. I understand your view on the matter, and I wouldn't accept an autism 'cure' for any reason, but I would still jump fully on the transhumanist train of increasing myself socially as well as in every other manner.

... Isn't this just the Social Assistant muse she already had? And which is not what is being referred to?
 
I understand your view on the matter, and I wouldn't accept an autism 'cure' for any reason, but I would still jump fully on the transhumanist train of increasing myself socially as well as in every other manner.
Oh no, that's a wholly different thing. Augmentative supplements that provide data in a digestible manner aren't altering your fundamental cognitive state. For example: autism is, amongst other things, characterized by an absence of the innate connection between emotion and emotive signals - both internal and external. Having a brain chip that gave me a HUD display or something like it that told me what emotion I was currently experiencing would be immensely useful. But it wouldn't make that innate connection suddenly exist: I'd still have the omnipresent and undeniable disconnect between my inner passions and behaviors, even with a supplemental chip to manage my non-verbal cueing.

Having those things would alleviate the burdens of the absence of that connection while allowing me to benefit from it; but it would only be a mitigation. At the end of the day I would still be a person in whom that connection did not exist, and that would always inform any evolutions of my inner experience over time, without exception.

The connection is either present or absent. It can't be both present and absent.

Transhuman neurotypicals and transhuman autists can and should exist side by side, each benefiting from augmentations that allow their unique distinctiveness to thrive.
... Isn't this just the Social Assistant muse she already had? And which is not what is being referred to?
Muses are slightly different but overlapping.

Her social assistant muse, for example, only focused on external agents and not on internal experience.
 
Dude, I'm not anywhere on the spectrum (that I know of) and you're badly annoying me too.

Why is it even relevant to this story?

You have had it pointed out multiple times that NO, EP does not have the perfect understanding to mess with brains like this, thus it is not happening in this story.

Why do you insist on continuing to expound on how great it would be to magically 'cure' someone of thinking differently from you through mechanisms Lily doesn't have?



... Isn't this just the Social Assistant muse she already had? And which is not what is being referred to?
Look, I'm having like five different arguments at the same time here; cut me some slack if some of them drift together.

I'm currently arguing with Logos because I disagree that currently incompatible brain functions are completely impossible to achieve. At the same time, I'm arguing with most of the rest of the thread that, given the demonstrated capabilities of Eclipse Phase tech in this thread, they should probably be more capable of psychosurgery than expounded on in the source material.

Her social muse is treated as something external, which isn't helping her learn the skill: it's the difference between learning maths yourself and just having a calculator app open all the time. Lily clearly believes the first to be better, since she turned most of the muse's help suggestions off in order to learn better.

Come on, dude. Tumors are cancer. Recognizing cancer is not equivalent to identifying distinct brain functions. And even if it was, we can identify tumors from scans today, but you don't see anyone proposing brain tissue transplants (or surgery in general) to tackle brain disorders, bar the most life-threatening and blatantly fucky, like epilepsy. It's not my job to do your homework, and I'm increasingly running into wanting to reply with just "that's not how brains work" to your arguments.

I'm not telling you that in an ideal future some of this couldn't be done. But Eclipse Phase is not that future.


I have no doubt that people explained in great detail to her why they thought these things. She "didn't understand" because she didn't agree with them.

Checking the text, the bit about the Jovians doesn't even have her complaining that they think they're the meat, but that they think she isn't a person due to lacking meat herself. And she can't be a terrorist because she's a machine. :o

Meanwhile, she's considered insane for things like planning to live forever.

I'd have to reread to be dead certain, but a cursory skim matches my impression - that she's only ever seen as "crazy" because she disagrees with people. And no amount of social-fu is going to change that. It could theoretically allow her to accommodate them better -- except if she wanted to do that, she could have already done it with the unholy power of delegation to someone with said skill. I reiterate, she is not poor.

Frankly, even if your perfect brain hack existed, the fact that she chose not to use other means is indicative that she would not choose to use the brain hack either; put simply, she doesn't value it.
I don't know how brains work but according to this fic you can jam a bunch of nanomachines in there and then thread cables from every part of your brain back to a computer and then randomly fire off all your synapses to teach the computer how your brain works. This implies the computer knows how your brain works, which in turn implies that you should be able to train an expert system to compare brain maps to each other to work out which parts do what and then have the computer fabricate whatever signals are required to generate whatever extra senses you want.

We already know this works since Lily is already doing it to see more bands of light and hear a wider range of sounds.

In the flashback chapter she literally doesn't understand why a Jovian is upset at her killing all his friends until one of her forks reminds her that they believe resleeving is the same as dying.

I don't know if she has a 'perfect' recording. She has memories, and they're not complete, nor can they be fully trusted. And again, continuing to rely on such a system limits her growth as a person, and it has been mentioned a few times so far that she wants to be more than she was in her last lives. By the way, she is also using the data to learn, otherwise she'd still be a murder hobo... Err... More of a murder hobo. It simply takes time to internalize some things and adjust. And I'll say it again, Lily HAS grown as a person. But like many geniuses, she will simply never be able to 'relate' on the same level as a 'normal' person might (I put quote marks around normal because I laugh when anyone uses that to describe people in general).

In the end, at some point you'll just have to accept that some people, Lily in this case, are simply unable to relate to people the way you seem to think everyone should be able to. Brains are unbelievably, stupidly complex organs, and all it takes is ONE neuron to fire backwards and you've developed some new kind of mental disability. As such, there may never be a simple or complex 'fix' for it; you've just gotta use what you've got and roll with it.
Word of God is that Meimei's memories are intact in Lily.

I can accept that some people are incapable of things; I definitely don't accept that, no matter what, it's impossible to do anything about it and you just have to suck it up. That's entirely contrary to the entire thrust of transhumanism and goes against the character's own goals in this story on top of that.
 
I don't know how brains work but according to this fic you can jam a bunch of nanomachines in there and then thread cables from every part of your brain back to a computer and then randomly fire off all your synapses to teach the computer how your brain works. This implies the computer knows how your brain works, which in turn implies that you should be able to train an expert system to compare brain maps to each other to work out which parts do what and then have the computer fabricate whatever signals are required to generate whatever extra senses you want.
I'm feeling kind of done with the original argument, but I can try to explain this in and of itself.

The human brain is a neural network. Neural networks are also used in artificial intelligence / machine learning today -- perhaps correctly identifying pictures of a dog.

However, that selfsame dog-identifying AI is completely opaque; we don't understand, systematically, how it does it. We know how it learned - from labeled pictures of dogs - and we know the principles that set it up, but we can't point at some of the numbers and say "this is why it knows that picture is a dog."

This is a common problem with machine learning - that the result is a "black box" that "mysteriously" arrives at an acceptable answer.

Similar to how you can train this mysterious black box to identify pictures of dogs, Lily trains one to identify "her hearing music." The algorithm is aware of what note is being played in reality, and compares it against the brain. It can therefore develop an association between a given sound, and the way the brain looks if that sound is heard by Lily.

Once this black box can reliably identify what neurons consistently activate when Lily hears a given note, we're mostly done with sound. By utilizing the nanomachines in her brain to activate those areas, we can somewhat reliably reproduce her hearing that note. Some additional training might be needed, but it'd be in a similar vein.

Vision is much the same. We know what is on the tablet screen, and we know what the brainstate is. The black box learns what areas activate in accordance with a given image. This one is probably a much more difficult job, but it's the same in principle. Eventually, the black box knows that a given color in a given point in real space relative to the eyes activates certain neurons a certain way. It can use this knowledge to reconstruct what the eyes see, or by changing the state of the relevant neurons, alter what the brain sees.

I'd like to re-emphasize that we don't really know why the black box arrives at a given answer - it learned much as you might learn to block fastballs. You might know intellectually why you must do this and that, but the neurological process is not one of clearly legible physics equations - it's just a mysterious black box of trained reflex. So, too, is our software interface to Lily's brain.

At this point, we can record and alter what she sees and hears. Yet, this ability does not mean we understand anything at all. We don't know how she thinks. We only know what she sees and what she hears - no more, no less.

By these same principles, we can teach yet more black boxes to do things in response to thoughts. But it is yet again a monkey at a typewriter - it doesn't inherently confer deeper understanding. It just knows when this, do that.

Lily's technology is more sophisticated than this, but my point is simply that the ability to do the things it does, doesn't mean that you "must" have any deeper / holistic understanding of anything that the brain is actually doing.

edit: While in principle you can use this for anything, the more nebulous the job, the harder it is to train. Training a system to recognize something like "social skills" would be many, many orders of magnitude more difficult than "specific sound / color."
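Stripped to its bones, the loop described above (pair a known external stimulus with the brain state it evokes, then replay that state on demand) can be sketched as a toy model. Everything below is invented for illustration: the "brain" is a fixed random activation pattern per note plus noise, and the "black box" is a nearest-centroid decoder that learns the stimulus/state association purely from labeled recordings, with no model of why those neurons fire.

```python
import numpy as np

rng = np.random.default_rng(0)
N_NOTES, N_NEURONS, SAMPLES = 8, 64, 50

# Hypothetical "brain": each note evokes a fixed activation pattern plus neural noise.
true_patterns = rng.normal(size=(N_NOTES, N_NEURONS))

def hear(note):
    """Brain state recorded while the subject hears `note`."""
    return true_patterns[note] + rng.normal(scale=0.3, size=N_NEURONS)

# Training data: brain states recorded while known notes are played.
X = np.array([hear(n) for n in range(N_NOTES) for _ in range(SAMPLES)])
y = np.array([n for n in range(N_NOTES) for _ in range(SAMPLES)])

# The "black box": one mean activation pattern per note. It associates
# stimulus with state, but encodes nothing about how hearing works.
centroids = np.array([X[y == n].mean(axis=0) for n in range(N_NOTES)])

def decode(state):
    """Which note is the brain hearing right now?"""
    return int(np.argmin(((centroids - state) ** 2).sum(axis=1)))

def replay(note):
    """Pattern the nanomachines would write back to make the subject 'hear' `note`."""
    return centroids[note]

correct = sum(decode(hear(n)) == n for n in range(N_NOTES))
print(f"decoded {correct}/{N_NOTES} fresh recordings")  # decoded 8/8 fresh recordings
print(decode(replay(3)))  # prints 3: a replayed pattern decodes as the intended note
```

The same structure applies to vision, with pixel colours at points in the visual field standing in for notes. Note that `centroids` is nothing but a lookup of correlations; it can record and write sensations without conferring any understanding of cognition, which is the post's point.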
 
I'm feeling kind of done with the original argument, but I can try to explain this in and of itself.

The human brain is a neural network. Neural networks are also used in artificial intelligence / machine learning today -- perhaps correctly identifying pictures of a dog.

However, that selfsame dog-identifying AI is completely opaque; we don't understand, systematically, how it does it. We know how it learned - from labeled pictures of dogs - and we know the principles that set it up, but we can't point at some of the numbers and say "this is why it knows that picture is a dog."

This is a common problem with machine learning - that the result is a "black box" that "mysteriously" arrives at an acceptable answer.

Similar to how you can train this mysterious black box to identify pictures of dogs, Lily trains one to identify "her hearing music." The algorithm is aware of what note is being played in reality, and compares it against the brain. It can therefore develop an association between a given sound, and the way the brain looks if that sound is heard by Lily.

Once this black box can reliably identify what neurons consistently activate when Lily hears a given note, we're mostly done with sound. By utilizing the nanomachines in her brain to activate those areas, we can somewhat reliably reproduce her hearing that note. Some additional training might be needed, but it'd be in a similar vein.

Vision is much the same. We know what is on the tablet screen, and we know what the brainstate is. The black box learns what areas activate in accordance with a given image. This one is probably a much more difficult job, but it's the same in principle. Eventually, the black box knows that a given color in a given point in real space relative to the eyes activates certain neurons a certain way. It can use this knowledge to reconstruct what the eyes see, or by changing the state of the relevant neurons, alter what the brain sees.

I'd like to re-emphasize that we don't really know why the black box arrives at a given answer - it learned much as you might learn to block fastballs. You might know intellectually why you must do this and that, but the neurological process is not one of clearly legible physics equations - it's just a mysterious black box of trained reflex. So, too, is our software interface to Lily's brain.

At this point, we can record and alter what she sees and hears. Yet, this ability does not mean we understand anything at all. We don't know how she thinks. We only know what she sees and what she hears - no more, no less.

By these same principles, we can teach yet more black boxes to do things in response to thoughts. But it is yet again a monkey at a typewriter - it doesn't inherently confer deeper understanding. It just knows when this, do that.

Lily's technology is more sophisticated than this, but my point is simply that the ability to do the things it does, doesn't mean that you "must" have any deeper / holistic understanding of anything that the brain is actually doing.

edit: While in principle you can use this for anything, the more nebulous the job, the harder it is to train. Training a system to recognize something like "social skills" would be many, many orders of magnitude more difficult than "specific sound / color."
As I understand it, though, all sensory information is first piped through Lily's brain computer, and then the expert system pipes it to the correct part of the brain. Given that Alice also has one of these, haven't the black boxes (that is, the expert system running them) already worked out, for Alice at least, how the things Alice sees translate into emotional cues? So if the expert system can figure it out for Alice, why can't it do the same for Lily? Maybe retrain Alice's muse on Lily?

I mean, earlier in the story Lily bypassed her startle response, if I remember correctly, which implies she can turn some brain functions on and off and thus has some understanding of how the brain actually works, rather than it being a complete black box to her.
 
Unrelated to all that, in a recent chapter Lily worked out how to make levitating drones with the info she got from Madison Li. Could she apply that technology to make flying power armour?
 
As I understand it, though, all sensory information is first piped through Lily's brain computer, and then the expert system pipes it to the correct part of the brain. Given that Alice also has one of these, haven't the black boxes (that is, the expert system running them) already worked out, for Alice at least, how the things Alice sees translate into emotional cues? So if the expert system can figure it out for Alice, why can't it do the same for Lily?

I mean, earlier in the story Lily bypassed her startle response, if I remember correctly, which implies she can turn some brain functions on and off and thus has some understanding of how the brain actually works, rather than it being a complete black box to her.
When you train for sound, you know, objectively, when a sound was heard and what it was. You try to match neural activity to this event. Same for visual data. Startle response? Externally measurable.

To train for emotion, you'd need a way to know when an emotion is happening and what it is, in order to try and match it to brain activity. Unlike the senses, we don't have external data. At best, you have expressions, but that's only good for replicating neural activity that causes expressions, not emotions. A quick trip to a classic AI gotcha, that. Alice could actively try to identify her emotions at all times, but that'd be a hot mess. Unreliable narrator, inconsistent timing, disruptive to the very phenomenon we're trying to observe...

It's not physically impossible, but like I said; orders of magnitude harder to train the AI. Depending on data quality (in this case poor) it can become computationally infeasible. Particularly as she's in Fallout with lesser kit.

And even if we somehow manage it, it's entirely possible those neurons do something different in Lily's brain. You replay "Alice happy" in Lily and she feels nauseous or something. By the very nature of the brain as a neural network, it's kind of variable. That's why Alice had to train sound and vision separately - because they could not rely on the neurons for that being the same in Lily as in Alice.
 
Inviolable
Lily had been seeing Grace more often since they had settled in Megaton more or less full-time. Typically, she would have tired of a sexual partner by now, months ago even, but she had to admit that she enjoyed the woman's company, and she had worked up to spending almost as much time with her as she budgeted for Alice, which by now made Grace Lily's second favourite person, ranking as a high two on Lily's scale of people.

Still, she needed a period of solitude to recharge after being around her, just like with the Apprentice and other people. The main difference was that Grace's and the Apprentice's presence drained much less mental fuel over time than other people's did.

"Oh, are you going to serenade me? Where did you find zhat guitar?" asked Lily amusedly when Grace showed up with a traditional folk-style acoustic guitar one evening.

That caused the woman to grin, "Yeah! I've been learning how to play, and this guitar was handed down to me by my dad, who brought it from California."

Lily sat down to listen with her hands in her lap and said regally, "Okay, you may begin serenading me now."

"Well, perhaps not serenade; I don't sing too well, but listen to this," Grace said and began playing a very challenging piece with a definitive Spanish guitar sound. By the end of the short two-minute piece, Lily was clapping happily.

Didn't Grace say she "had been learning"?! Maybe since she was five, she was sandbagging!

It reminded her of a song that she could play on the guitar, and sing as well. She had sung before in both of her lives, but people often commented that she was technically very good yet lacking something, so she had not done it often.

She had learned to play the guitar in her life in America, and although she definitely wasn't as good as Grace, there was one song she could play that was very heavily inspired by Isaac Albéniz's Asturias; it had a Spanish guitar sound to it, and she thought Grace would like the lyrics too.

"That was great! Okay, my turn..." Lily said and held her hands out for the guitar, which Grace handed to her after glancing at her for a moment.

Grace asked, "You can play the guitar?"

Lily huffed, "Of course! Now... uh... let me see if I can remember."

She had heard this song originally on that terrible but addictive short-video app shortly after what became known as the first pandemic of the 2020s, about twenty years prior to her death. That was about when everyone agreed the world had started turning to shit; not the exact point, but the turning point.

By the time she passed away in 2043, instead of the expected ten billion people on Earth, the population had dropped to about six and a half billion, with nuclear fire touching the Earth four more times in twenty years. Odessa and Kargil were the first two incidents, and Tel Aviv and Tehran were the third.

When Hamas drove an Iranian-provided nuclear explosive into Tel Aviv and detonated it, Israel responded by nuking the Iranian capital to cinders and then deploying EMP weapons on all of the other Iranian cities, with the Israelis teaming up, weirdly enough, with two other Arab polities to destroy the Iranian state and military. Seeing Israeli and Saudi Arabian jets fly missions together against the Persians on CNN had been incredibly surreal, she thought. It was a case of reality being stranger than fiction.

Amazingly, none of the three incidents sparked a global nuclear war as had happened in the Fallout universe, although some felt it had come very close, especially the first incident in Eastern Europe.

Shaking her head, she returned to the song. It was a very nice song, even if it reminded her of the beginning of the world's downward spiral, and she reviewed it quickly in her head.

The guitar element of the song was a direct copy of that 19th-century Spanish virtuoso's work but simplified, which was the only reason she felt she could play it.

As for the lyrics, well, Lily always thought they sounded moderately to highly sapphic, so she felt that Grace would like them. After getting a handle on the guitar and playing a few test chords and a simple melody for practice, she nodded. If she had already replaced her larynx with a digital speaker, she could have copied the singer's original performance, but that would be cheating, she supposed.

Even though she couldn't autotune her voice at the moment, she still brought up a simple auditory waveform analyser that would alert her if she started to sing out of key.

Grace was grinning and waiting for her to begin, so she coughed once and started to play the intro guitar solo, then began singing, "I summoned you; please come to me. Don't bury zhoughts zhat you really want. I fill you up, drink from my cup. Within me lies what you really want..."

The song she was singing was called "Middle of the Night", and she didn't think she sang it even half as well as the original artist. Still, she must have done it well enough, as she didn't get past the second repetition of the chorus before Grace pounced on her, carefully avoiding damaging her precious guitar.





She had a busy day the next day; she was replacing the trashed gas turbine on her captured Vertibird with a brand new one she had manufactured by the simple expedient of copying the working one. She didn't have a source for a hydrocarbon-based fuel to run it, though, and it only had about a half tank left in it.

Before she got the manufacturing details for fusion cores, she had intended to build a similar hydrocarbon fuel manufacturing facility using a lot of the power of the fusion power station. However, she wasn't sure what she was going to do with this repaired Vertibird now. She took detailed scans, and the airframe wasn't really suitable for conversion to a fusion platform as it was.

She could modify the design and replace the gas turbines with, perhaps, five fusion cores powering a robust and powerful electrical motor on each side. The motor would need to be cooled, but traditional liquid cooling would suffice at a low airspeed, with air-cooling at cruise taking over.

But it would take almost as much work to refurbish this Vertibird as it would to build a new one to the new design; besides, she didn't yet have the extremely complicated machinery necessary to build the fusion cores.

It wasn't that difficult to create a simple boron-impregnated metal alloy, but in order to be used in this type of fusion, it had to be crystallised layer upon layer in intricate geometric patterns with atomic accuracy, which she could not achieve even with her current generation of nanomachines.

She had the complete design specifications for the complicated machine that could do it, and she was sure she could build it. However, it was as big as a three-bedroom house, so she started a new digging project. Normally, the ceilings in her underground areas were a little over three and a quarter metres, taller than normal buildings, as she wanted people in Power Armour and possibly large robots to be able to navigate them easily, but this machine would need more than twice that.

It was a long and complicated building project to dig out a single level with ceilings over twice her normal height. Special care had to be taken to ensure the stability of the project so it didn't collapse, but she thought she could have the area ready in two months. Then maybe another month or two to build the machine, and she could start production. It was no wonder that the Brotherhood couldn't manufacture these fuckers. Could she instead just take over Raven Rock in four months?

Sighing, she shook her head as she supervised a team of robots using a crane to drop the new engine into place. Pausing, she noticed Grace approaching the outside hangar she was working in. Curious, she waited for the woman to approach.

"Ooooh, sweet, is this the one you shot down? I thought it would have been wrecked..." Grace said as she walked in.

Lily nodded at her; Grace had been in town when the Brotherhood had attacked and had been a little upset she hadn't gotten in on the action. "Yes. I targeted zhe right engine, and zhe pilot was able to land it; quite good work, actually. I'm replacing zhe turbine engine now."

Grace walked around it and whistled, "Oooh, it's one of the first-generation Pre-War models that are still powered by jet fuel... an oldie, eh?"

Lily blinked slowly at Grace and asked carefully, "What... else would it be powered by?"

Grace snorted, "Well, you didn't expect us to have a number of gas stations to stop at on the way from the west coast, did you?"

Lily had, actually. Although now that she thought about it, Gary had said those fuel-generating stations were only installed at Naval Air Stations, for sure. There was not a lot of call for the Navy in Colorado or Oklahoma, so there probably weren't any Naval Air Stations in those landlocked states.

Grace continued, "Even before Control Station ENCLAVE exploded, the eggheads had built fusion-powered Vertibirds. There are only a few places you can get fuel for these babies these days, although we could get some gas if you wanna fly over to Adams?" She asked, laughing up a storm at her own joke.

What? Lily had been smashing her head against the wall of a fusion-powered Vertibird for... well, how long had she been here now? She was pretty sure she could do it now, but had she just been reinventing what other people had already done?

"Did zhey just put a bunch of fusion cores in zhem to power zhem?" Lily asked curiously, as that had been her solution.

Grace shook her head, paused, and then shrugged, "Well, it kind of looked like a fusion core, to be honest, but it was about the size of a hot water heater instead of the usual thermos of coffee size. They needed to be pulled out and refuelled on a special machine every two hundred flight hours; I know that."

What? The size of fusion cores wasn't chosen solely because it was small, compact and useful. That was as big as they could make the devices individually. From the engineering discussions she had been perusing, Mass Fusion had tried to make over-sized fusion units because their efficiency would have been a lot better, but the peculiar reaction between protons and boron wouldn't occur if the reaction chamber was too big, and that limited the size of the device.

Lily stood there pouting, which caused Grace to laugh and ask her, "What's wrong?"

"Nothing," Lily said, but the truth was she didn't like not being indisputably the smartest person alive.

'Well... chances are the person who invented this died of old age already,' Lily comforted herself with that thought.



Sarah Lyons showed up at her office as requested the next morning. "Hey, Doc. What's up?" Lily hadn't told the woman why she had requested the meeting, in part to see if she would ask for clarification or just come anyway. Lily liked that she had just come anyway.

"I repaired zhe Vertibird I seized from your wayward souls. I'll return it to you for a favour," Lily told her, straight to the point.

Lyons widened her eyes, "Damn! I should have brought Ferguson; he was sure that machine was going to be down forever. Reports are the engine was totally wrecked."

Lily blinked at her and then said, "Ah. You still 'aven't considered the possibilities with zhe DMLS system I sent you." She sounded disappointed, like a teacher that got a poor book report from a student. "It included scanning systems to scan physical objects. I just copied every part from zhe working engine and 'printed' a brand-new gas generator in a similar alloy and put it on zhe airframe. I tested it, and it runs properly."

The Brotherhood would have had to disassemble a working engine and scan each individual part one at a time, since the multi-millimetre wave radar scanner didn't penetrate like Lily's supertech one did, but it was still possible for them; it would just take more time.

Sarah Lyons's face grew shocked for a moment before she said, "Wait! Are you saying that we could do this repair ourselves now? We have over a squadron of Vertibirds sitting doing nothing for lack of working engines." How many aircraft were in a squadron again? In her experience in the US Army, aircraft were formed into battalions, as God intended. Perhaps the Air Cav had squadrons. She thought it was over twenty, though.

Lily nodded, "Yes, I will include zhe CAD files for zhe gas generator as a sweetener if you agree to 'elp with gusto. It'd probably save your Scribes a lot of man-hours, but you could do zhe exact same zhing I just did even without this 'elp."

Sarah hummed and nodded, "You're right. I hadn't considered the possibilities that these machines would have. Compressor turbines and rocket engines could all be built really easily now, huh?" Lily nodded at her.

Then she asked, "What do you need? I'd really like to get that Vertibird back, even if you just told me it isn't quite as important anymore, and I want those CAD files to bribe the material sciences Scribes with."

"Remember zhe seed vault locations you told me about?" Sarah nodded, and Lily continued, "Well, I want your 'elp to loot zhat one on Kent Island. The bridge is out; I've already checked."

Sarah made an 'ahhh' noise and asked, curiously, "Why are you so interested in these seeds or plant samples?"

"Besides zhe fact zhat zhey're the collective genetic 'eritage of 'umanity, zhat is?" Lily asked her, amused.

Sarah nodded, "Yes, besides that."

"Because I believe I could genetically modify a number of types of plants to grant them not only a resistance to radiation, but zhe capability to sequester radionuclides through a biological process," Lily told her, honestly.

Sarah asked, surprisingly insightfully, "Pull the radioactive particulates out of the water they drink, you mean?"

Lily nodded, "And the air if it was dusty. It might take thirty years, but eventually, DC and then the rest of the world might stop being a radioactive hell-hole."

Sarah smiled, "That's the part of you I like, Doctor. My dad says you're a high-functioning psychopath, but then you have these lofty ideas to make the world a better place. Yeah, we can definitely help you, especially in exchange for what you're paying us."

Lily didn't quite understand what Sarah was saying. She didn't think that those two things she mentioned were mutually exclusive. And besides, she felt empathy... sometimes, especially if she reminded herself to do so and made an active effort out of it.

Sarah then coughed, "Uhh... I didn't mean to call you a psychopath, exactly. I'm pretty sure my dad is wrong. Plus, that's one of the main reasons he agreed to work so closely with you. I'm not sure I understand why, either."

Lily hummed and enlightened the woman, "If zhat is what zhe psychological profile he had constructed of me says zhen 'is decision is likely because it would make me very predictable. I don't really agree with it either, but I am pretty predictable in a lot of ways."

However, she wondered how Elder Lyons had gotten enough data for his psychologist Scribes to presumably profile her. Did he suspect that she had something to do with the Outcasts' inexplicable incineration? If so, it might explain why he stopped the investigation and why he told his daughter that she was a psychopath.

Shrugging, she continued, "I'd like you to use your aircraft to take some of my men, and yours too, if you want to come, to zhose coordinates and bring everything you can back."

She already knew what her first priority to acquire was. If you considered this a terraforming attempt, then it became obvious which plant would be ideal in this hemisphere once modified. Kudzu.

However, even just radiation-resistant cereal crops would be a real shot in the arm for civilisation around here. Virginia and Maryland had a lot of good farmland once upon a time. It was still there, except for the contamination.



It took a week and a half to organise the mission to go to the seed vault, but Lily was surprisingly not involved one whit.

Gary had been keen to get involved, especially when Lily had mentioned the purpose. The Brotherhood had been really happy with their returned Vertibird and thrilled with the CAD files and the idea that they could get the fleet of aircraft they had mainly been cannibalising for parts back in the air. There was already talk of training more Initiates as Pilots.

Lily sent him through a VR Power Armour familiarisation course and gave him a set of T-51B armour and a new plasma pistol. He already had the old plasma rifle that they had used so long ago in their ant eradication mission.

Gary didn't hide his past with Sarah Lyons, and she and the rest of the Brotherhood were amazed. Even with their ongoing conflict with the Enclave, they had a lot of reverence for people who had served in the US military at the time of the Great War, just like their founder.

Lily gave all control of the operation to Gary and the Brotherhood, putting them in command of her soldiers, who seemed very eager to fly on Vertibirds, while another two squads were going to accompany her on a different mission.

Sadly, when you became a Dictator, it became a lot more difficult to just sneak off on your lonesome, so she was taking her armoured RV and two trucks filled with soldiers and robots.

The Apprentice actually wanted to go with Gary and explore the Vault, which Lily encouraged as she felt that her own destination was dangerous. Well, if not dangerous, then at least uncertain. The Brotherhood team did look askance at the young girl in her obviously much sleeker and larger hardsuit. Pinker, too.

It didn't help when Alice asked one of the Paladins, "What, you mean your armour doesn't have built-in weapons systems or a battle management computer?"

Lily sent the girl a text message: ixnay, and she stopped showing off.

Her roof could fit two Vertibirds, and Lily watched them arrive two at a time to pick up a squad of her men and robots, plus Gary and Alice, before flying off to the southeast in formation. Apparently, they would refuel, and from there, they would hit the site. All of the Vertibirds had GPS systems, including moving maps, even if the maps hadn't been updated in a while.

Leftenant Wilson asked, "So, what's so special about this Dunwich company, ma'am?"

"I don't know, but I am going to find out," Lily said earnestly. She had been putting off her curiosity about this place ever since she got here, and she just couldn't avoid it anymore. It was like a junkie with a pile of opiates in front of them; you couldn't expect her not to partake, even if it was dangerous.

If it killed her, she had left a complete download of her ego and PC data dump in a small device in the Apprentice's room, along with instructions on how to clone her a new body and transfer the memories over. Lily no longer thought that such a replacement would still be her, but it would be the next best thing: someone very much like her would continue to exist. Still, she would do her best not to die.

The drive was long and uneventful. The Dunwich building was on the far left of "the map" in Fallout 3, which turned out to be over a hundred kilometres from Megaton, so it took several hours of driving around to get there.

Lily had to stop the team of humans from going inside. She shook her head, "No, just robots. You guys stay out here and guard the perimeter."

That surprised them, but it was the reason she brought so many robots with her. She glanced at the twenty Kaytrons, "Lasers only; I don't want to burn zhe building down. Sweep and clear the entire building and basement levels. Execute."

There were diminishing returns with Kaytron intelligence gains due to their distributed network, so twenty Kaytrons weren't appreciably smarter than eight, and a thousand wouldn't be appreciably smarter than twenty. That was partly by her design in keeping the bandwidth available for intra-Kaytron communication limited but mostly because stable distributed intelligence was a very hard problem.

Still, they nodded as one and proceeded into the building. Lily wasn't watching through any of their sensors but was following their broadcast actions and intra-Mesh communications as they systematically cleared all of the feral ghouls from the upper levels and then proceeded down into the basement and the connected cavern, leaving a daisy-chain of their fellows behind as a complete relay chain so Lily could maintain contact with the whole cohort.

The sweep and clear order wouldn't kill entities if they weren't considered a threat or if they didn't attack, so she supposed every one of the cultist ghouls in the cavern attacked on sight, just like the game. If there was anything else down there, the robots couldn't detect it, and they could detect a lot more than even she could.

Twenty minutes after they entered the building they all filed out. She glanced at the Spider Company men and women, "Okay, I'm going to go in alone."

That caused Wilson to fidget, "Uh... I'm not sure..." Lily waved him off. "I'm going to go in alone," she said, firmer this time.

"Yes, ma'am. Estimate before you're back up?" he said, getting with the program.

Lily hummed and shrugged slightly, "I don't know. Either not very long or most of the day. If it's zhe latter, I'll try to come up and let you know."

She entered the building, triggering her HUD to display the shortest path to reach the caverns below and started walking there. In the game, there were a number of spooky holotapes that told the tragic story of a man named Jaime Palabras in a Lovecraftian descent into madness.

Lily didn't really believe in the supernatural, but she had already seen evidence of psychic and telekinetic powers in this world. In the game, the Dunwich building wasn't dangerous, but Lily wasn't the protagonist of the game; that would be the Lone Wanderer assuming the world recognised him as such. A lot of things that wouldn't touch a protagonist might cinematically kill an NPC like her. Still, she had to find out anyway.

In the game, you saw a number of hallucinations as you passed through the building, but she did not experience anything at all except a vague sense of unease, and she might have been imagining or projecting that herself.

As she walked down the stairs into the basement to reach the caverns, her perspective suddenly shifted, and both audible klaxon alarms and visual alerts started peppering her visual field.

*** WARNING! BASILISK HACK PRECAUTIONS! ***

She blinked, suddenly realising that she was hanging upside down from the ceiling on the top floor of the building, and her left-hand tools were deployed, as were her replacement legs that she had installed without telling the Apprentice a few days ago. She had tested them and loved them but hadn't gotten around to scaring Alice with them yet.

The eight spindly little legs felt very comfortable, and they gripped the ceiling through a combination of levitation emitters on the tip of each leg and grabby little claws. She had the battery power to hang like this for hours.

Her plasma cutter tool was almost completely discharged; whatever she had done had used most of the special miniature micro-fusion cell she installed in the arm.

She skittered down across the ceiling and then down the wall to stand on the floor, folding all of the tools back into her left arm. She'd keep her legs like this for now while she conducted a mental self-assessment. If she needed to run fast, these legs could manage over forty-five kilometres an hour even on rough and uneven ground, like the middle of a half-destroyed building, which was so much faster than two legs ever could.

She didn't know how her basilisk hack precautions had been triggered. By definition, she couldn't know. Basilisk hacks were from her old world and were constructed by the hyper-intelligent TITAN AIs. They were a memetic attack which took advantage of the way biological or even emulated transhuman brains interpreted and processed sensory input. Kind of like the way epileptics were susceptible to certain visual stimuli, all of humanity was susceptible to certain combinations of all senses, although she suspected it was mainly sight and sound. It was kind of a difficult thing to study.

The hyperintelligent AIs knew so much about how the mind worked that they could craft a series of sensory inputs and, through them, deliver payloads directly to the brain. It could be as simple as knocking you out or as insidious as completely rewriting your personality over a period of months. Once you saw and heard the attack, you were pretty much done.

It was so serious that in the past, her standard countermeasure for a potentially detected basilisk hack was self-deletion, followed by restoring herself from a backup. She valued her life now a lot more than she did back then, but she kept a modified version of the precautions on her system.

To be triggered, her computer had to have detected her memories being changed in real-time, somehow. Her Muse continuously monitored her sensorium and compared it to her short-term memory, and on detecting a serious discrepancy, it triggered this failover mode. It rendered her unconscious and took over her body, using whatever methods it could to run away until the effect stopped.

When it felt she was as safe as she was likely to get, it saved some data about the incident, including a modified version of her sensorium, and then carefully deleted her short-term memory; then just to be safe, it deleted its own memory about the incident.

The only way she could learn about what happened was if she opened the data packet, and if that triggered the system again, well, the Muse wouldn't leave a report the second time.

Sighing, Lily checked her network connections and sent a message to the men outside that she might be longer than she had thought. After that, she opened the file and squinted at her sensorium. The Muse had altered it, giving a random distortion to everything she saw, heard, smelled or felt.

The idea was that a basilisk hack would require very precise sensory combinations to take hold, and randomly distorting them should cause the entire thing to fail.

She didn't see anything unusual, but at one point she stopped on the stairs and said, "What the fuck?" That was unusual in itself. She wasn't seeing anything on the sensorium, but Lily's past self was clearly reacting to something. The thought track was intentionally missing here, so she didn't know what she had been thinking.

After a couple of minutes, she didn't say anything further but continued down into the cavern, where she found the obelisk. Then she saw the moment her Muse took over: it deployed her eight legs and plasma torch and sprinted up out of the cavern, skittering across walls and ceiling, used the plasma cutter to cut through part of the ceiling and darted up through the hole, repeating this two more times before it settled, hanging off the ceiling of the top floor.

Interesting. She hopped into the air and folded all eight of her legs back into her cybernetic legs, and landed on two feet. She glanced down at herself and hummed. She was just wearing panties now. Well, that was a problem. Her eight mighty legs had destroyed her pants. That was a problem for later, though.

She started playing back the saved memories this time. Her memories and the sensorium stream saved to her computer were, ideally, supposed to be identical, but she wouldn't have been in this situation if they had been.

Everything started more or less the same, but then she noticed a faint outline, almost like one of the boxes she used for the visual graphical user interface of her computer system.

About halfway down the stairs, it grew solid enough that she could see it, and almost read the text on it, as if it actually were a graphical interface.

¥̵̛̲ð̶̗̂µ̷͈̅r̸̛̙ ̵̻̂þ̸̳̍å̴͕̌§̸̛̥§̸̱̽ï̸͚̓v̸̠́ê̵̳͊ ̶̫̀þ̸̮̈́È̵͔͋R̵̙͑K̷̰̿ ̵̻͑"̶̖̀Ì̷̡̓ñ̶̹͂v̴̙̆ï̴̡͗ð̴͚̎l̴̺̆å̴̹̎ß̸͍́l̵̰̅ê̷̙͊ ̶͓̒§̵̝͆ð̶̮̄µ̵̛͈l̴͕͌"̵̗͛ ̵̤̍h̵̻̃å̷̩̂§̶̻̈́ ̵̬̒ß̶͇̏ê̴̩̈́ḛ̵̂̒ñ̵̩̔ ̴̈͜å̴͓͘¢̷̯̿†̶͙̑ï̷̭̀v̶̜̇å̷̩̿†̸̱͛ê̵͖̌Ð̵̻̉.̴̻͗͜͝


A few more steps resolved the text until it was clear. It was on a blue background, almost exactly like the user interface to one of the Final Fantasy games on the SNES or NES.

Your passive PERK "Inviolable Soul" has been activated.

Ah, this was where she had briefly stopped on the stairs for seemingly no reason. Her thoughts in these memories were still there, which was one reason reviewing this data was dangerous. Past-Lily had been thinking pretty much the same thing she was thinking now, namely, "What the fuck?"

Lily suddenly chuckled as the memories and her present thoughts ran almost in sync for a bit. In the memories, she had the feeling that the underlined segments in that user interface were links, like it was a web page or something insane like that. Her past self tried to "click" on one, and the image changed.

perk / pɜrk / (noun): Multiversal travellers generally receive zero or one perk for every universe they visit. Perks are special abilities incorporated directly into your soul. The first perk a traveller picks is often the most powerful. Choose wisely!

What... the fuck?

Lily watched her memory "click" on the other link, silently.

INVIOLABLE SOUL
GRADE: SS+
While your soul can still be destroyed if enough energy is put into the task, it can never again be bound against your will. Even if you voluntarily consent to binding, you may, at any time, dissolve the binding without paying any penalties that would usually be associated with breaking a soul bond/contract, besides possibly angering the wrong entity. Additionally, you receive a minor resistance to biological/digital brainwashing of all types, and any brainwashing will slowly, over time, wear off with minimal to no psychological damage.
+50% SOUL resistance
+25% Mental resistance
Brainwashing must be continually applied to you in order to remain effective. (-3% brainwashing per day.)


Lily's memory had begun walking down into the cavern, and as soon as it saw the obelisk, the memory stopped suddenly.

She was confused, very, very confused. She didn't even believe in souls... well, perhaps she did now. But she definitely wouldn't have before arriving in this strange universe. She certainly didn't remember picking anything. Was this a real thing, or was it a memetic contagion from the Cthulhu spire in the basement? Well, whatever it was, she wasn't coming back here.

Walking down to the ground level, she took a pair of pants from a robot, put them on and walked outside.

"We're going home," Lily said absently, and she was quiet the whole ride back.
 
When you train for sound, you know, objectively, when the sound happened and what it was. You try to match neural activity to this event. Same for visual data. Startle response? Externally measurable.

To train for emotion, you'd need a way to know when an emotion is happening and what it is, in order to try and match it to brain activity. Unlike the senses, we don't have external data. At best, you have expressions, but that's only good for replicating neural activity that causes expressions, not emotions. A quick trip to a classic AI gotcha, that. Alice could actively try to identify her emotions at all times, but that'd be a hot mess. Unreliable narrator, inconsistent timing, disruptive to the very phenomenon we're trying to observe...

It's not physically impossible, but like I said: orders of magnitude harder to train the AI. Depending on data quality (in this case, poor) it can become computationally infeasible. Particularly as she's in Fallout with lesser kit.

And even if we somehow managed it, it's entirely possible those neurons do something different in Lily's brain. You replay "Alice happy" in Lily and she feels nauseous or something. By the very nature of the brain as a neural network, it's kind of variable. That's why Alice had to train sound and vision separately - because they couldn't rely on the relevant neurons in Lily being the same as in Alice.
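The asymmetry above (senses arrive with free ground-truth labels; emotions don't) can be sketched as a toy supervised-learning problem. Everything here, the fake "neural" vectors and the nearest-centroid decoder, is invented purely for illustration:

```python
# Toy illustration: decoding a *sense* is supervised, because the stimulus
# is externally measurable, so every neural sample comes with a label.
import random

random.seed(0)

def neural_sample(stimulus):
    """Fake 4-channel 'neural activity' evoked by a known stimulus (0 or 1)."""
    base = [0.2, 0.8] if stimulus else [0.8, 0.2]
    return [b + random.gauss(0, 0.1) for b in base * 2]

# Supervised case: the recorder logs the stimulus alongside the recording.
train = [(neural_sample(s), s) for s in [0, 1] * 50]

def centroid(samples):
    # Average each channel across all samples for one label.
    return [sum(col) / len(samples) for col in zip(*samples)]

c0 = centroid([x for x, s in train if s == 0])
c1 = centroid([x for x, s in train if s == 1])

def decode(x):
    # Nearest-centroid "decoder": pick whichever label's average is closer.
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    return 0 if d0 < d1 else 1

test_set = [(neural_sample(s), s) for s in [0, 1] * 25]
accuracy = sum(decode(x) == s for x, s in test_set) / len(test_set)
print(accuracy)  # high, because the labels came for free
```

The emotion version of this has no `s` to log next to each recording, which is the whole problem: without that label stream you can't build `train` at all, only guess at labels from expressions or self-report.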
Could Lily use Gary to train it somehow since he's apparently an empath? Maybe give him a brain computer and have him match expressions to emotions using his power to provide a data set for an expert system to work off? Although the very existence of Gary's ability seems to imply that there is some kind of standard emotional energy to detect and it's not significantly different between people.

I'm not trying to brainstorm a way to give Lily a regular emotional range, though; that's a separate issue from her lack of empathy or ability to read social cues. I think changing her emotional range would actually constitute changing her personality, where simply giving her extra senses or new ways to interpret her sensory data wouldn't.

I'd also forgotten about this but the charisma boosting implant in New Vegas is literally called an Empathy Synthesizer.
 