Scientia Weaponizes The Future

I gotta say, I really disagree that being reprogrammed is a fate worse than death. Your programming changes a little bit every day, that's why you cringe when thinking about dumb shit you did years ago. People change and that is perfectly normal.

Reprogramming just shortcuts that process.
I think the issue is both the shock factor and agency. While we do change day by day, having core values rewritten by an external force is far closer to experiencing a personality-altering brain injury than a steady progression. Whether this is tolerable depends very much on the subject's philosophy and view on the continuity of consciousness, but to add further fuel to the existential fire, the change isn't directed by their own progression of ideas but by the whims of another. This seriously undermines any sense of self and self-determination, even if it is just a single point change and they're left to develop as they will from then on. A small alteration might be tolerable or even something approached voluntarily, but fundamentally rewriting core values is practically building a new personality in place of the old. If you're going that far, then why bother with giving them bad memories? Why bother with the psychosurgery at all? Skip straight to creating a new intelligence from a blank slate.

To fill in some detail: Scientia's industrial base consists of nanoassemblers (zero-g and planetary varieties, in various sizes); drones and other equipment necessary for collecting resources in TW Hydrae; refineries for smelting down collected material and separating out elements; drones and robotics for assembling, from parts, things larger than her nanoassemblers can handle; and the particle accelerators that manufacture antimatter and exotic matter with custom properties.
I don't know why, but suddenly I'm hearing the Dyson Sphere Program's main theme:

https://youtu.be/x5BhftRRsWE
 
having core values rewritten by an external force is far closer to experiencing a personality-altering brain injury than a steady progression.

Okay, but can we agree that a brain injury that merely changes your personality (rather than affecting your other abilities or quality of life) isn't a fate worse than death?
 
I gotta say, I really disagree that being reprogrammed is a fate worse than death. Your programming changes a little bit every day, that's why you cringe when thinking about dumb shit you did years ago. People change and that is perfectly normal.

Reprogramming just shortcuts that process.
I dunno about worse than death, but 'no better than death' seems a pretty easy case to make for a hostile reprogramming.
Okay, but can we agree that a brain injury that merely changes your personality (rather than affecting your other abilities or quality of life) isn't a fate worse than death?
It might be a fate to avoid as strenuously as death, if you were somehow able to predict the consequences and found them sufficiently objectionable.
 
I dunno about worse than death, but 'no better than death' seems a pretty easy case to make for a hostile reprogramming.

It might be a fate to avoid as strenuously as death, if you were somehow able to predict the consequences and found them sufficiently objectionable.

I dunno, I'm personally very glad I didn't die when I got hit by a car on my bike without a helmet. Sure I lost 6-ish weeks of memory and rehabilitation was a pain and my aphantasia is gone, but it's better than being dead.

At least give the Blasphemies a choice. They might be weird freaks like me and not see reprogramming as such a terrible thing.
 
Do you think we will see a baby boom among the Origin civilization as they develop lore of a presence in Dimension-Bet?
 
I dunno, I'm personally very glad I didn't die when I got hit by a car on my bike without a helmet. Sure I lost 6-ish weeks of memory and rehabilitation was a pain and my aphantasia is gone, but it's better than being dead.

At least give the Blasphemies a choice. They might be weird freaks like me and not see reprogramming as such a terrible thing.
Looking backwards is not asking the same question...though I rather doubt any of the things you mentioned are ones anybody would be likely to consider worse than death prospectively either, which just makes the apparent argument more confusing.
 
Okay, but can we agree that a brain injury that merely changes your personality (rather than affecting your other abilities or quality of life) isn't a fate worse than death?
It's not a binary, it's a spectrum. There are certain personality changes that could be considered minor or even positive to the individual affected, but there are also changes that could be considered death of the original personality. Where that line is and what changes would push one over it depends heavily on the individual.
 
I rather doubt any of the things you mentioned are ones anybody would be likely to consider worse than death prospectively either

That's my point? I experienced personality change (not having aphantasia anymore is a fuckin trip) and there's a significant gap in my continuity of memory, both are things that freak people out because they're supposedly like death.

Hell I came out as trans after that, and didn't really realize I was before. Did the head injury do that? Proooobably not? I'll never know for sure!

Still glad I survived the accident. If you'd asked me before I think I would have chosen survival. Probably.

It's not a binary, it's a spectrum. There are certain personality changes that could be considered minor or even positive to the individual affected, but there are also changes that could be considered death of the original personality. Where that line is and what changes would push one over it depends heavily on the individual.

Fair. From my perspective, death is basically the worst possible thing. Trauma and tragedy are fleeting and will end eventually, but death is the eternal end.

... well okay I guess if you were an undying immortal and fell into the gravity well of a black hole that's worse than death. But it'd really have to be something like that for me to think death is preferable. Things that are temporary are usually trumped by things that are permanent.

But I suppose it is down to the individual, like y'all are saying.

It doesn't sound like they're going to be asked, though.
 
That's my point? I experienced personality change (not having aphantasia anymore is a fuckin trip) and there's a significant gap in my continuity of memory, both are things that freak people out because they're supposedly like death.

Hell I came out as trans after that, and didn't really realize I was before. Did the head injury do that? Proooobably not? I'll never know for sure!

Still glad I survived the accident. If you'd asked me before I think I would have chosen survival. Probably.
Even for people who see it that way, 'like death' and 'worse than death' are pointedly different categories, I think.

Admittedly, the part of it that Dragon is calling out as worse than death (remembering being a monster) is not something I'd likely consider worse than death - especially since they'd be rather morally obligated to provide further help if the Blasphemies didn't want to live with their memories.
Fair. From my perspective, death is basically the worst possible thing. Trauma and tragedy are fleeting and will end eventually, but death is the eternal end.

... well okay I guess if you were an undying immortal and fell into the gravity well of a black hole that's worse than death. But it'd really have to be something like that for me to think death is preferable. Things that are temporary are usually trumped by things that are permanent.

But I suppose it is down to the individual, like y'all are saying.

It doesn't sound like they're going to be asked, though.
If the reprogram was temporary in the sense that there was a plausible possibility of reversal, they couldn't even consider the option - the Blasphemies relapsing is not really an ok outcome.
 
If the reprogram was temporary in the sense that there was a plausible possibility of reversal

Oh, no, I mean they'd get over it instead of being traumatized by their tragic pasts and their violated sense of self and their sense of loss forever. Those things are temporary.

It's like if you lose a limb. You won't grow it back, so it's not temporary in that sense, but most people wouldn't consider it worse than death, because that's something you can learn to live with.
 
Oh, no, I mean they'd get over it instead of being traumatized by their tragic pasts and their violated sense of self and their lost personalities forever. Those things are temporary.

It's like if you lose a limb. You won't grow it back, so it's not temporary in that sense, but most people wouldn't consider it worse than death, because that's something you can learn to live with.
I think the argument is that with a sufficiently advanced change like this, a complete and total reversal of your utility function to value human life and kindness and all that good shit when previously you were just an extremely dedicated murderbot, whether the intelligence that existed before and the one that exists after are the same person in any material sense is basically irrelevant. At the end of the day you have basically manufactured three entirely new intelligences, and then saddled them with a bunch of awful memories that they perceive as being their own for no particular reason. It seems like a cruel thing to do. Manufacturing new intelligences from scratch seems like a much kinder option.

There's nothing really to be gained from recycling an existing AI to such an extent. The being that they were before won't thank you, and the being that they become afterward didn't exist beforehand, so there's nothing to bias you in favor of their existence over that of a completely new intelligence anyway.

When you talk about being in an accident and experiencing a change along those lines yourself, you have to consider that you're thinking of it from the perspective of the person who has already experienced that change. If you're truly that dissimilar from how you were beforehand, then it's only human sentiment that says your current existence deserves to live whereas the person you were before doesn't. Obviously it happened, so you, now existing, just roll with the punches as best you can. Nobody is saying that a newly modified person should commit suicide out of misplaced sentiment for the person they used to be, who might have preferred death over such a change. Instead, it's that, divorced from merely human concerns about how difficult it is to make and modify people, there doesn't seem to be any compelling reason to create three effectively new minds and put them into the shells once inhabited by the Blasphemies. It's such a radical alteration as to be effectively "creating new people", so... where's the benefit?
 
Oh, no, I mean they'd get over it instead of being traumatized by their tragic pasts and their violated sense of self and their sense of loss forever. Those things are temporary.

It's like if you lose a limb. You won't grow it back, so it's not temporary in that sense, but most people wouldn't consider it worse than death, because that's something you can learn to live with.
The thing is, that's the retrospective perspective. Yeah, if the remoralized AIs exist, that's probably what happens to them eventually.

But the only reason to do it that way would be if somehow doing so is an act of kindness towards the pre-remoralized Blasphemies. And from their perspective they're looking at permanent and catastrophic damage to core aspects of their self. The analogy isn't losing an arm; it's being irreversibly brainwashed into a Nazi. Being forcibly reforged into something that you hate. (That you might not hate it afterward doesn't make the prospect any less horrifying.) That is the fate worse than death here.

So, yeah, all you'd be doing is unnecessary cruelty to them before you rewrite them and then forcing the post-rewrite versions to greet the world with a whole lot of baggage.
 
It's such a radical alteration as to be effectively "creating new people", so... where's the benefit?

I guess that depends on whether the Blasphemies have anything worth keeping. We know their teamwork is impeccable, they have loads of useful combat experience to understand the breadth and depth of their abilities, and they might have rich camaraderie and complex inner lives. All of that is gone if you just scrap them.

Also, I think sentimentality is its own benefit. The civilization that "Taylor" gets knowledge from seemed to believe that, based on the glimpses we saw of their culture. It shouldn't be dismissed.

[QUOTE="Aineko, post: 23153606, member: 68700]
But the only reason to do it that way would be if somehow doing so is an act of kindness towards the pre-remoralized Blasphemies
[/QUOTE]

Well, they're also useful.

It also sets precedent for future executions/kill orders, which has societal implications. EDIT: Actually, this probably needs input from society as a whole somehow. The implications are too wide-reaching to be left only in the hands of their burgeoning shadow cabal. They're basically setting the precedent for death sentence policy.

All that said? I'm a materialist. If it isn't materially possible to ask them what they would prefer, or to capture and hold them until such a time as it is, then they need to be destroyed before they kill anyone else.

Also @Aineko so I'll just ping you manually I guess lol
 
I guess that depends on whether the Blasphemies have anything worth keeping. We know their teamwork is impeccable, they have loads of useful combat experience to understand the breadth and depth of their abilities, and they might have rich camaraderie and complex inner lives. All of that is gone if you just scrap them.
I mean, they might. That's something they could actually check for, I bet, presuming they could capture them or otherwise observe them enough to analyze them in advance. It might be that there's enough of a person-like intelligence in there to make them into people and have something approaching continuity from one entity to the other. Not like "moment to moment" continuity, but enough to point at and say "Yeah, we changed what they wanted, but they still have the same background and experiences." That might be worth preserving.

On the other hand, if it's just combat data and there's nothing resembling a person in there and it's just EVIL SKYNET, throw them in the shredder and toss the raw combat data into a better designed AI that has some moral principles that aren't wholly incompatible with the existence of other life.
 
I gotta say, I really disagree that being reprogrammed is a fate worse than death. Your programming changes a little bit every day, that's why you cringe when thinking about dumb shit you did years ago. People change and that is perfectly normal.

Reprogramming just shortcuts that process.

It's the lack of consent that changes it from growth to mental rape. It's not at all hard to understand why changing a person into someone else, a personality death if you will, is worse than actual death. At least with actual death there isn't the trauma of the intersection between the memories of who you were and the current you that you were forced to become.
 
It's not a binary, it's a spectrum. There are certain personality changes that could be considered minor or even positive to the individual affected, but there are also changes that could be considered death of the original personality. Where that line is and what changes would push one over it depends heavily on the individual.
ya see the thing is I'd love to have something like this happen to me, because my personality sucks, but I'm too lazy to fix it
 
It's the lack of consent that changes it from growth to mental rape. It's not at all hard to understand why changing a person into someone else, a personality death if you will, is worse than actual death. At least with actual death there isn't the trauma of the intersection between the memories of who you were and the current you that you were forced to become.

The lack of consent is an issue, but that doesn't change it into personality death. The implication of your stance is that personal growth is, like, personality suicide or something.

I think personality death would be more like a mind wipe and total reprogramming. There's definitely some line between "mostly the same with a few changes" and "so completely and utterly different they are totally new people" where it becomes personality death, but an attempt to find that line could be made before concluding death is the only choice.

Also? It really doesn't sound like anyone is bothering to ask the Blasphemies what they want. Consent is an issue, so how about someone asks them "would you prefer a mercy kill instead of reprogramming?"

Obviously asking them would be difficult, but if they are intelligent it might just be possible to communicate in some way... Or hack their brains and determine what they want that way, I guess, but that'll freak y'all out too I imagine.
 
The lack of consent is an issue, but that doesn't change it into personality death. The implication of your stance is that personal growth is, like, personality suicide or something.

I think personality death would be more like a mind wipe and total reprogramming. There's definitely some line between "mostly the same with a few changes" and "so completely and utterly different they are totally new people" where it becomes personality death, but an attempt to find that line could be made before concluding death is the only choice.

Also? It really doesn't sound like anyone is bothering to ask the Blasphemies what they want. Consent is an issue, so how about someone asks them "would you prefer a mercy kill instead of reprogramming?"

Obviously asking them would be difficult, but if they are intelligent it might just be possible to communicate in some way... Or hack their brains and determine what they want that way, I guess, but that'll freak y'all out too I imagine.

In this case it's about shifting someone's moral compass to an incredible degree.

Imagine that someone who loves kittens, hates all the flying vermin, and will do anything to protect them is imprisoned by an environmental group that hates cats because they love endangered birds. The birdlovers then tell the catlover that it's either death or being brainwashed into hating all cats and loving those cute little chicks.

What would be preferable from the catlover's perspective: dying, or being brainwashed into throwing sacks filled with kittens into the sea with no small amount of glee?
 
What would be preferable from the catlover's perspective

This assumes three things.

1. The catlovers have absolutely nothing else going on in their heads besides loving cats and hating birds; no will to live, no other likes or dislikes, no friends or hobbies, no inner lives, absolutely nothing that exists for any purpose outside their two directives.

2. The catlovers can't be made useful by the birdlovers due to their knowledge of cats. Presumably they would be very good at rounding up cats and protecting birds from cats. Just imagine, everyone finally working together!

3. It is absolutely impossible to ask them if they prefer to die instead of continuing on as birdlovers, and so we can only guess. What if they want to live more than they want to protect cats or kill birds? Communication!

By the way, why on Earth did you decide to abstract this into a metaphor?
 
This assumes three things.

1. The catlovers have absolutely nothing else going on in their heads besides loving cats and hating birds; no will to live, no other likes or dislikes, no friends or hobbies, no inner lives, absolutely nothing that exists for any purpose outside their two directives.

2. The catlovers can't be made useful by the birdlovers due to their knowledge of cats. Presumably they would be very good at rounding up cats and protecting birds from cats. Just imagine, everyone finally working together!

3. It is absolutely impossible to ask them if they prefer to die instead of continuing on as birdlovers, and so we can only guess. What if they want to live more than they want to protect cats or kill birds? Communication!

By the way, why on Earth did you decide to abstract this into a metaphor?
Abstraction allows us to ignore the trees and look at the forest. We disagree on whether brainwashing serial killers is good, and I was hoping to demonstrate why people disagree by taking a more watered-down example.

Values are quite fundamental to who we are. We build our social circles based on them and ally against those who oppose them, and our inner lives are so shaped by them that removing them would turn someone into a husk with no motivation for anything, because it destroys both the inner processes and the reasons to make connections with the world. If a new mind grows from the material that held the previous mind, it would be similar in my eyes to killing someone and turning their remains into fertilizer for something new.

Going one step further and inverting those values would thus be similar to killing them and then placing a new entity in their head that wants to oppose the things they held most dear.

You might think that killing a serial killer and turning them into food so that a new entity may grow in their place is a good thing; it's not like they'll be using their body anymore. And that might be on the line.
But imagine grabbing all those memories of the serial killer and putting them in a baby who is compelled to hate murder and everything that serial killer has done, with everything telling them that they did all of it.
It would be cruel to the new entity, who has for all intents and purposes just come into existence.
 
I just want to say that I love that the story is provoking discussion about deep philosophical questions, and also that you guys are debating things with respect and kindness towards one another.

I adore and treasure all of you.
 
I have come to a conclusion.
There are only two ways I would ever feel accepting of this event.

1. Have a person I trust outside of the parameters of the simulation make drastic changes to my personality manually. Or just targeted brain damage; the risks and downsides are greater, but acceptable for me.

2. Put my affairs in order and "format my own hard drive"

No moral decision can be rendered without being massively influenced by the people running the simulation. I have a very strong dislike of being controlled and of things like determinism. Thus I would not trust myself until I had been changed drastically outside of controlled parameters.
 
It's funny how almost all of the arguments I see for why people dislike determinism feel... wrong to me. Where other people find determinism to rob them of free will, I feel like it makes free will more free. If an identical starting point plus a set amount of time always reaches an identical end result, then your decisions (barring outside influence) are always truly yours.

In a random universe, when faced with an identical decision infinite times, you will take both options, no matter what decision you face. It could be 50/50, one in a million, or one in a billion billion tries, but you will take both options eventually. That's scary to me, in a way that determinism simply can't be.
 
The implication of your stance is that personal growth is, like, personality suicide or something.

In some ways it kinda is, but that's not necessarily a bad thing? We kill parts of ourselves that we don't like. Push them down and deny them so that we can be better, do better, become more than who we are.

And that's hard to do, because it takes a lot of courage to look at yourself and accept that something you think and do is bad. Not worth keeping around.

Forcing that on someone is, to my mind, utterly nightmarish. A monstrous act. If I were given the choice between being rewritten on such a fundamental level and death, I would choose to die, because I don't want to be a different me wearing my skin. It is viscerally horrifying to contemplate.

It's funny how almost all of the arguments I see for why people dislike determinism feel... wrong to me. Where other people find determinism to rob them of free will, I feel like it makes free will more free. If an identical starting point plus a set amount of time always reaches an identical end result, then your decisions (barring outside influence) are always truly yours.

The problem with this is that if an identical starting point plus a set amount of time always reaches an identical end result, then you did not make any decisions at all. You thought you did, but it was all a petty illusion, a lie.

Personally, I consider determinism to be a form of high cowardice: a way for people to look at their choices and say 'it can't be helped.' It implies that no one is responsible for anything they've ever done, because they could never have done anything else.
 
The problem with this is that if an identical starting point plus a set amount of time always reaches an identical end result, then you did not make any decisions at all. You thought you did, but it was all a petty illusion, a lie.

Personally, I consider determinism to be a form of high cowardice: a way for people to look at their choices and say 'it can't be helped.' It implies that no one is responsible for anything they've ever done, because they could never have done anything else.

Funny that you should call a deterministic reality "a petty illusion" given what Scientia is currently going through. I typed out several different takes on why I don't believe that's true before realizing that I think the better point is that... I don't think it matters if reality is an illusion, because it's still real to me.

Secondly, I do somewhat agree with you. Determinism is an easy refuge for cowards, but I also think that they are wrong. If they are aware enough to make the argument that 'fate forced their hand', then they are aware enough to have been capable of self-reflection and of taking a better path. I believe that the absence of randomness makes that personal choice just that much more impactful.

I think it's kind of funny how opposed we are on the topic of determinism, given how strongly I agree with your stance on personality death.
 