...Who Needs Enemies? [AltPower!Taylor / Worm]

On the whole cannibal Lisa thing, I think it's just that the shards haven't learned the difference between what Lisa would actually do and what she would only think about doing before dismissing it out of human decency, so quickly that she would never even know she'd thought of it.
 
Medical regulations are the way they are because we are (for now) completely dependent on our brains. The point I made in my first post is that what matters is software, not hardware. If you are uploaded to an artificial computer, then the fact that your thoughts are running on silicon instead of carbon is irrelevant. If someone loses all their memories and patterns of behavior to amnesia, but the brain is intact, then that person is still gone. The fact that the hardware their thoughts used to run on is still fine doesn't matter. Since what matters is the software, when your software stops (or at least stops functioning in a fashion that we would call a person's thoughts), that can usefully be called a form of death. When I sleep my software (consciousness, coherent thought) ceases. When I lose a few minutes of memory to a concussion, that's a form of dying. Not nearly as severe a death as if I lost all my memories, but there's no hard line in the sand that can be drawn between them.

...What the fuck are you talking about brain uploads for? You understand that we have no real idea if that will even be possible yet? Let alone how it might work if it does. This is exactly the kind of thing I'm sick of, people pulling shit out of thin air to argue real life like it was sci-fi.

More importantly, this is simply you trying to shift the goal posts. The claim was that sleeping is equivalent to dying, which is horse shit. Stop dancing around and just admit it.

Oh, and this bit:

When I sleep my software (consciousness, coherent thought) ceases.

Is a bit... I mean, do you consider your computer's OS software to be ceasing when you change from typing out a document to watching a movie?
 
...What the fuck are you talking about brain uploads for? You understand that we have no real idea if that will even be possible yet? Let alone how it might work if it does. This is exactly the kind of thing I'm sick of, people pulling shit out of thin air to argue real life like it was sci-fi.
Principle of computational equivalence. Absolutely fundamental, and in no way sci-fi or controversial.
More importantly, this is simply you trying to shift the goal posts. The claim was that sleeping is equivalent to dying, which is horse shit. Stop dancing around and just admit it.
I just addressed this directly; I'm not seeing much point to this if you are just going to ignore and insult me. Reread my previous posts, because they've already addressed everything on topic. Specifically, why having Lisa's head blown apart and then nearly perfectly rebuilt is death in a way equivalent to falling asleep or getting a concussion.
 
Principle of computational equivalence. Absolutely fundamental, and in no way sci-fi or controversial.

Really? It's not controversial? Which neuroscientists are you talking to? Actually, I'll ask about scientists in general, because my experience has been a little different from what yours seems to have been if you don't think that computational equivalence is controversial in mainstream scientific thought.

Not to mention that it still doesn't matter because the comparison you tried to draw didn't make sense even if we were to accept PCE as a premise.

I just addressed this directly; I'm not seeing much point to this if you are just going to ignore and insult me. Reread my previous posts, because they've already addressed everything on topic. Specifically, why having Lisa's head blown apart and then nearly perfectly rebuilt is death in a way equivalent to falling asleep or getting a concussion.

No, you haven't. Not to mention that if you read my own posts you'd see that my initial objection was specifically to conflating death with sleep in any case. Which you then disagreed with.
 
Really? It's not controversial? Which neuroscientists are you talking to? Actually, I'll ask about scientists in general, because my experience has been a little different from what yours seems to have been if you don't think that computational equivalence is controversial in mainstream scientific thought.

Not to mention that it still doesn't matter because the comparison you tried to draw didn't make sense even if we were to accept PCE as a premise.
Alright, apparently 'principle of computational equivalence' is specifically a Wolfram term; it's related, but not exactly what I meant (and it makes stronger/unclear claims than are widely accepted). The principle I'm referring to is that if you run a program on one machine (of sufficient strength; you can't solve the halting problem on a Turing machine), then you can run that same calculation on any other (sufficient) machine and get the exact same result. This has been proven, and has been demonstrated for an enormous number of machines, from soap bubbles to ball bearings. Of course you haven't heard of neuroscientists using this, because it's most relevant to computer scientists and mathematicians. So if you are claiming that the brain is stronger than a Turing machine, then you are making a really strange and unsupported claim, but even then you aren't out of hot water, because that just means you need an oracle machine or whatever the equivalent is, and the existence of the brain shows that such a thing is possible.
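
To make the substrate-independence point concrete, here's a quick toy sketch (Python, everything in it is my own made-up illustration, nothing from the fic or anyone's actual argument): the same calculation, "add one to a binary number", run once on a hand-rolled Turing-machine simulator and once directly with the host's native integers. The answers have to match, because the result of a computation doesn't depend on what it's running on.

```python
# Toy illustration only: a tiny Turing machine that increments a binary number,
# checked against the same calculation done natively by the host machine.

def tm_increment(bits):
    """Increment a binary string with a minimal Turing machine.

    Head starts on the least-significant bit and carries leftward:
    read 1 -> write 0 and move left; read 0 or blank -> write 1 and halt.
    """
    tape = {i: b for i, b in enumerate(bits)}      # sparse tape, blank cells absent
    head = len(bits) - 1
    while True:
        if tape.get(head, '_') == '1':             # carry: flip to 0, keep moving left
            tape[head] = '0'
            head -= 1
        else:                                      # 0 or blank: write 1 and halt
            tape[head] = '1'
            break
    return ''.join(tape.get(i, '_') for i in range(min(tape), max(tape) + 1))

def native_increment(bits):
    """The same calculation, done directly with the host's integers."""
    width = max(len(bits), (int(bits, 2) + 1).bit_length())
    return format(int(bits, 2) + 1, f'0{width}b')

for n in ['0', '1011', '1111']:
    assert tm_increment(n) == native_increment(n)
    print(n, '->', tm_increment(n))
```

Swap the dict-based tape for pen and paper, ball bearings, or neurons and nothing about the answer changes; that's the sense in which the substrate is irrelevant.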
 
...

I think that the MOST relevant point here is FUCKING CLONE ARGUMENT.

Jesus titty fucking Christ, what does a debate on the morality of whether or not a clone is the same as the original have to do with the story?

There is a potential clone in it. Why does it matter if it is the same as the original or not?

More importantly, how many more pages of people WHO AREN'T EVER GOING TO CHANGE THEIR MINDS no matter what is said am I going to have to skip through?
A lot. That's pretty much every argument ever.
 
So... how about we talk about the poor person who is going to have a horrible day when the Endbringers come to get that sandwich they were going to eat.

Like one person in an Endbringer shelter unwrapping a sandwich to eat when suddenly the top rips open to reveal all three of them gazing down.

No one talks, or even breathes honestly, while the sandwich is slowly lifted into the air and taken back to Taylor.
 
So... how about we talk about the poor person who is going to have a horrible day when the Endbringers come to get that sandwich they were going to eat.

Like one person in an Endbringer shelter unwrapping a sandwich to eat when suddenly the top rips open to reveal all three of them gazing down.

No one talks, or even breathes honestly, while the sandwich is slowly lifted into the air and taken back to Taylor.
Obviously that sandwich would have triggered with a power to stop the Endbringers, and was the real target of the attack.

I mean, imagine everyone's reaction when they vanish after taking said sandwich. Just a lot of wtf all around.
 
Alright, apparently 'principle of computational equivalence' is specifically a Wolfram term; it's related, but not exactly what I meant (and it makes stronger/unclear claims than are widely accepted). The principle I'm referring to is that if you run a program on one machine (of sufficient strength; you can't solve the halting problem on a Turing machine), then you can run that same calculation on any other (sufficient) machine and get the exact same result. This has been proven, and has been demonstrated for an enormous number of machines, from soap bubbles to ball bearings. Of course you haven't heard of neuroscientists using this, because it's most relevant to computer scientists and mathematicians.

Okay, that makes more sense than what I took you to mean. Certainly I agree with this, though it doesn't mean brain uploads are possible, and it certainly doesn't make discussing the theory of them possible in more than the loosest terms.

So if you are claiming that the brain is stronger than a Turing machine, then you are making a really strange and unsupported claim, but even then you aren't out of hot water, because that just means you need an oracle machine or whatever the equivalent is, and the existence of the brain shows that such a thing is possible.

No. I'm claiming that shutting down a program is fundamentally different from changing what it's doing.

Sleep is not a state of inactivity in the brain so comparing it to death, which is very much defined by inactivity, is simply wrong.
 
Okay, that makes more sense than what I took you to mean. Certainly I agree with this, though it doesn't mean brain uploads are possible, and it certainly doesn't make discussing the theory of them possible in more than the loosest terms.
It doesn't matter that we don't have a mechanism for doing a brain upload; we can talk about the implications of one nonetheless. We know the result would be equivalent to the person running on an organic brain. It's just like how we can't accelerate a ship to 0.9c, but can still perform meaningful calculations and thought experiments about what would happen if we did.
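
To put an actual number on the 0.9c analogy (a back-of-the-envelope sketch of my own, the ten-year trip is just an example figure, not anything from the thread or the fic):

```python
# Back-of-the-envelope only: what special relativity says clocks would do at 0.9c,
# even though no ship we can build gets anywhere near that speed.
import math

v_over_c = 0.9
gamma = 1 / math.sqrt(1 - v_over_c ** 2)   # Lorentz factor, about 2.29

earth_years = 10.0                         # trip length measured from Earth
ship_years = earth_years / gamma           # proper time experienced by the crew

print(f"gamma at 0.9c: {gamma:.3f}")
print(f"{earth_years:.0f} Earth-years -> {ship_years:.2f} years aboard")
```

The ship is hypothetical; the arithmetic isn't. Same deal with uploads.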

No. I'm claiming that shutting down a program is fundamentally different from changing what it's doing.

Sleep is not a state of inactivity in the brain so comparing it to death, which is very much defined by inactivity, is simply wrong.
Shutting down a program is just making a very large change in what it is doing. Just like if a person's brain were completely wiped, but still functional, that person would be just as dead as if the brain had been destroyed. Inactivity in the brain is not the real issue; the real issue is the software that makes you a person stopping or being degraded. If someone had brain damage that put them permanently in a state equivalent to sleep, they would be lacking much of what makes us a person. They probably wouldn't qualify as sapient or sentient, and hence would have died (though in a lesser sense than being completely wiped). The fact that sleep isn't permanent makes it the same case as the person whose brain was destroyed and then restored.
 
Alert: Please upload this tangent to another thread
The philosophy of self and how that applies to mind-copying is fascinating, but has gotten a bit off-topic for this thread. I mean, it's entirely possible that a work could touch upon those concepts and thus discussion of them within the context of the work could be on-topic, but in this case it seems to have gotten a bit general. Feel free to make a new thread to discuss it further if you'd like to, though!

To the people who posted in exasperation at the tangent: whilst I appreciate your sentiment, ironically what this actually does is make the derail larger and more disruptive, simply by definition. If you feel a line of discussion has wandered from the topic of the thread, just report a relevant post or two and explain your concerns.

I'm going to un-Clockblock the thread now.

Thanks for your time, and enjoy your discussion!
 
I quite like the image of a sandwich floating through the town as though it is a fixed point and the earth is rotating beneath it:

If the sandwich encounters a wall then it is the wall that gives way, without interacting with the sandwich (held together via TK, natch) whatsoever.
 
Wait... Coil's timeline where he's killed unsuspectingly by Lisa ends, but if he wasn't doing something substantially different in the other timeline (assuming he was still in the room, still on the computer), and unless Lisa was several random seconds slower in that second timeline, why didn't he just die pretty much instantly in both? This is confusing to me. I mean, this implies that in one timeline he spun around and shot at the dead girl, not knowing she was reanimated, while managing a perfect heart shot on someone he didn't know was there, and in the other he just kept working on his computer. Why the hell was he working on his computer if he thought there was a dead girl reanimating behind him?!

Edit: wrong word
 