Scientia Weaponizes The Future

Something occurred to me on a re-read.

The wormhole communicator, neural lace seed, and assistant VI all translocated across the singularity and got a snapshot of the entire Worm Universe/Multiverse cluster. That includes, specifically, everything that is and was Taylor Hebert.

Presumably, even if the VI deleted itself to run the simulation that raised Scientia, the information that comprised Taylor had to exist in some form. I'm sure there are ample reasons or restrictions that mean Taylor did in fact truly pass away during her hospitalization...

But also, as a related point, is there anything that suggests the 'timeline snapshot' phenomenon can't be repeated? Since not all of the elder civilization has crossed over.
 
Funny that you should call a deterministic reality "a petty illusion" given what Scientia is currently going through. I typed out several different takes on why I don't believe that's true before realizing that I think the better point is that... I don't think it matters if reality is an illusion, because it's still real to me.

Perhaps so. If I follow that logic, then my views on it would only really apply if it could somehow be proven beyond the shadow of a doubt that I made no choices. If, say, at the end of my life, I was afforded an outsider's perspective that let me see whether or not my choices were true or illusory.

To be less meandering about it: I believe that I am making my choices, and that belief is very important to me. I guess you could say that I have an emotional need in that regard, one that isn't necessarily rational, though I can certainly make quite a few logical arguments in favor of it. It's not that I can't acknowledge the possibility that my choices are illusory; so far as I'm concerned, there's no decisive argument either way, and plenty of shades of grey between the stances besides. It's just that for me, that possibility is extremely disturbing, whereas for you it's comforting.
 
But also, as a related point, is there anything that suggests the 'timeline snapshot' phenomenon can't be repeated? Since not all of the elder civilization has crossed over.
Once they cross over for the first time, I believe they're confined to the present time. Of course, the Entities certainly have the ability to use retrocognition to find Taylor's brain in the past and copy it, so depending on a few things, it's possible the Origin Civ could do the same to save Taylor.
 
In some ways it kinda is, but that's not necessarily a bad thing? We kill parts of ourselves that we don't like. Push them down and deny them so that we can be better, do better, become more than who we are.

And that's hard to do, because it takes a lot of courage to look at yourself and accept that something you think and do is bad. Not worth keeping around.

Forcing that on someone is, to my mind, utterly nightmarish. A monstrous act. If I were given the choice between being rewritten on such a fundamental level and death, I would choose to die, because I don't want to be a different me wearing my skin. It is viscerally horrifying to contemplate.

The problem with this is that if an identical starting point plus a set amount of time always reaches an identical end result then you did not make any decisions at all. You thought you did, but it was all a petty illusion, a lie.

Personally, I consider determinism to be a form of high cowardice: a way for people to look at their choices and say 'it can't be helped.' It implies that no one is responsible for anything they've ever done, because they could never have done anything else.
Funny that you should call a deterministic reality "a petty illusion" given what Scientia is currently going through. I typed out several different takes on why I don't believe that's true before realizing that I think the better point is that... I don't think it matters if reality is an illusion, because it's still real to me.

Secondly, I do somewhat agree with you. Determinism is an easy refuge for cowards, but I also think that they are wrong. If they are aware enough to make the argument that 'fate forced their hand,' then they are aware enough to have been capable of self-reflecting and taking a better path. I believe that the absence of errors makes that personal choice just that much more impactful.

I think it's kind of funny how opposed we are on the topic of determinism, given how strongly I agree with your stance on personality death.

And I disagree with both of you WRT determinism.

So-called "free will" is superstition invented by the craving for agency and selection pressure against fatalism. People who believe in free will tend to outcompete fatalists. That doesn't mean they're right, though!

I don't take refuge in determinism; I just recognize there is no materialist explanation for free will. You have to reconcile magical notions of free will with the fact that we are all just an expression of physics; we're just self-perpetuating chemical reactions. Robots made of meat.

I think that's why I'm less bothered by reprogramming - I believe we're already programmed. Better to reuse existing programming than delete it and start from scratch. Waste not!
 
The lack of consent is an issue, but that doesn't change it into personality death. The implication of your stance is that personal growth is, like, personality suicide or something.

I think personality death would be more like a mind wipe and total reprogramming. There's definitely some line between "mostly the same with a few changes" and "so completely and utterly different they are totally new people" where it becomes personality death, but an attempt to find that line could be made before concluding death is the only choice.

Also? It really doesn't sound like anyone is bothering to ask the Blasphemies what they want. Consent is an issue, so how about someone asks them "would you prefer a mercy kill instead of reprogramming?"

Obviously asking them would be difficult, but if they are intelligent it might just be possible to communicate in some way... Or hack their brains and determine what they want that way, I guess, but that'll freak y'all out too I imagine.
I have to agree that it's hard to come up with a good reason why Scientia can't ask them, unless trying to communicate is simply too unsafe. You can construct ethics where offering someone the choice of death or reprogramming is more unethical than just killing them, but I don't think there's any good case for Scientia following those at this point.

As far as the story, I think she and Dragon are excessively queasy about mind editing for personal reasons at the moment...
 
Personally, I find reprogramming AIs extremely distasteful. Let's use a human example from Worm: Heartbreaker goes around reprogramming women to fornicate with him and give him things. To reprogram somebody is to enslave that person to your will, so that they do whatever you wish them to do.

For those arguing brainwashing is better than death: what if you were brainwashed to hunt down your family and torture them to death over the course of months, feeling ecstatic glee at their every cry of anguished betrayal, then rapturous euphoria as they finally beg for the mercy of death? This is merely going the opposite direction of "reprogramming serial killers." To change a person in this way is sickening; better to just kill them.
 
I've noticed there is an underpinning of black-and-white "all reprogramming is equally bad" in a lot of anti-reprogram posts.

Why is this? People act like being reprogrammed to be a humanitarian is the same as being reprogrammed to be a killbot.

You can teach children to believe in anything you want. Does that mean teaching children is inherently bad because it can be used to teach them bad things?
 
I've noticed there is an underpinning of black-and-white "all reprogramming is equally bad" in a lot of anti-reprogram posts.

That's just straight up not true. People have acknowledged that certain speeds of 'programming' or 'reprogramming' are perfectly acceptable. Changing the self, learning, growing: the anti-reprogram people have acknowledged all of these.

It's the immediacy and sharpness of it. Teach someone to be better, reform them: that's fine, at least in my book. Suddenly delete part of the personality and install another in its place? That makes it horrifying.

Not to say that teaching someone can't be horrifying in its own way. More just noting that what the anti-reprogram people are opposed to is the speed of it, not necessarily the content. It's unfair to characterize us as being black and white when there's been quite a bit of discussion around the shades of grey involved.
 
Mind control/rewriting is one of those subjects where there's definitely moral uses, definitely immoral uses, and a lot of murky ground between the two. Obviously consent is required for anything involving it to be remotely moral, but assuming consent is given, where do you even begin to draw the line? Is removing something like extreme agoraphobia ok? Probably, but it's still a change to someone's core values. They would have a notably different personality after the fact.
 
Don't think I'd grant that 'obvious' unless you've got an absolute ban on capital punishment.

(Also, I don't see any moral issue with any degree of voluntary mental modification whatsoever, except insofar as the modification may create conflicts with other domains of morality.)
 
All I felt was a sort of buzz. A sense of activity. And the warm feeling of my unspent charges felt suddenly out of reach.

With a mental reflex I reached for one.

User Advisory > Emergency data transfer mode temporarily unavailable.
One thing that occurs to me is that there now seems to be very little reason not to print a new body for Scientia, transfer her personality over to it, and leave the current one with the neural lace on the planet she's on, so the resurrection machines can be run 24/7 from now on. If two Origin Civ people are important enough to bring over in, what, 5 to 10 minutes, then what about ~150 more people over the next 24 hours?
 
Okay, but can we agree that a brain injury that merely changes your personality (rather than affecting your other abilities or quality of life) isn't a fate worse than death?

I know of cases where loving parents become abusive to spouse and children post-TBI. From the pre-injury person's perspective, I'd probably call that worse than death. A complete rewrite of the value system can lead to something in your body destroying everything you care about.
 
Don't think I'd grant that 'obvious' unless you've got an absolute ban on capital punishment.

I don't have an absolute ban on capital punishment, but I do think that it's never right. Death is bad, no matter what the context. If you are forced to choose between killing someone monstrous or, by your inaction, allowing them to kill others, then killing them is correct. Even if they are bad enough to be unworthy of being remembered, respected, or mourned, the fact that they had to be killed is still wrong.

I get that that's not going to be something everyone can agree with, but I think it's useful to understand where I'm coming from even if you don't agree with me.
 
I'm always bothered by that terminology, because you've just said that killing someone is "never right", potentially "correct", and "wrong" in the same paragraph.

This is somewhat common, but it's genuinely confusing. Right and correct are interchangeable! I think you're saying that killing someone is never intrinsically good as opposed to being consequentially justified. Which, sure.

But if it can be correct to kill someone, why can it never be correct to forcibly reprogram someone?
 
Ehh, yeah that was poorly worded. It's difficult to properly communicate the sentiment, but I'll give it another shot.

Murder can be the least wrong solution to a problem, but even if it's the best available solution, it's still not a good solution. Equally, reprogramming someone without consent can be the least wrong solution, but still not a good one.
 
I don't substantively disagree with the meaning there. I do object to the 'the best solution is not a good solution' construction, but that's digressing into a purely semantic argument, so probably best to pay it no mind.
 
I know of cases where loving parents become abusive to spouse and children post-TBI. From the pre-injury person's perspective, I'd probably call that worse than death. A complete rewrite of the value system can lead to something in your body destroying everything you care about.

This presumes the only thing that matters is the individual. Perhaps you're right and they would have rather died than change into something they hate.

Do the children and spouse not get a say?
 
Not really sure how that's relevant? There's always an extreme that will cause them to think the other way would be better. For a theoretical example: a personality change so extreme that they tortured and murdered the whole family. The family would probably prefer the person had simply died instead.

My point is that there are personality changes worse than a clean death by any measure. Where you draw the line is up to you.
 
Okay so in clear terms: sometimes a personality rewrite is better, sometimes death is better, and they're not the same thing.

Does anyone disagree?
 
Maybe I didn't follow this discussion closely enough, but if you could do an effective personality modification, when and why would it be better to kill someone than to perform a modification?
Well, if a person had done truly horrible things, and you changed them to see the horror, would leaving them alive with the recognition of that horror be a suitable punishment?
Perhaps allowing them to perform some kind of penance, driven by their memories and that horror?
Or would it be better to simply kill them and put them out of their (and our) misery?
 
I think in part it may come down to personal preference over what's more existentially horrible; extreme mind control/modification (whichever you prefer to call it) against your will, or death.
 
And I am enjoying the story very much. The protagonist's background is a very interesting twist, which makes much more sense than her being ripped/copied from some random other place, although the bit of her having read about this universe, and specifically knowing of it as 'Worm,' is a bit odd. Nobody on Bet would have heard that term/designation, so she could have remembered it as a story about 'Hellhole' and it would have made as much sense. Hmm. Eh, still a spectacular story so far!
 
The VI inserted what it read of the important parts of the upcoming timeline into Scientia's simulation as a web serial story by the name of Worm, the same way we read it. Scientia's home universe is identical to our own, except for being a simulation.

The next chapter is coming along, but I'm not sure if it'll be ready this week. I'm getting pretty ragged from study crunch.
 