Which of the other starter choices do you want to see interludes from most?

  • Dishonored: 3 votes (7.0%)
  • Legend Of Zelda: 9 votes (20.9%)
  • Shadow Of Mordor: 2 votes (4.7%)
  • Tengen Toppa Gurren Lagann: 4 votes (9.3%)
  • Preacher: 0 votes (0.0%)
  • JoJo's Bizarre Adventure: 8 votes (18.6%)
  • Fist Of The North Star: 0 votes (0.0%)
  • Kill Six Billion Demons: 12 votes (27.9%)
  • The Zombie Knight: 0 votes (0.0%)
  • Mob Psycho 100: 2 votes (4.7%)
  • Author's Choice: 3 votes (7.0%)

  Total voters: 43
  Poll closed.
[X] "How complex does a being need to be to have a soul?"

[X] "Could a complex enough computer have a soul, since brains are basically that?"

[X] "What would happen to a living person if they lost their soul, but their body was undamaged?"
 
[X] "What causes the soul to exist in some things, but not others?" [???: +1]
 
[X] "Is a soul actually necessary to be considered, alive and healthy?" [???: +1]
 
I'm... honestly not sure what to vote here. All these metaphysical questions about whether the Transistor gaining a soul is a bad thing kinda make me leery of jumping headfirst into the discussion. But there is one thing I'm rather curious about:

[X] "What causes the soul to exist in some things, but not others?"

Edit: Also, that Arcana check right before we met the Librarian, was that her Semblance? Like a 'Don't notice me' field or a 'Someone Else's Problem' field?
Pretty sure the Arcana check was to notice either the Librarian's Semblance or maybe a glyph-type AoE effect. Based on the wording, I'm thinking something similar to a Bounded Field from the Nasuverse, where you set it up either to have a specific effect on something inside (think buff/debuff on specified targets) or to keep something outside from reaching inside/prevent something within from escaping (think actual barrier).

This one seemed to act as an alarm/alert system, basically an Aura tripwire that lets her know if someone is coming closer, but it may have other abilities that are less obvious or just not in use right now.

In regards to the vote, I'm currently unsure which way to go, since I'm not even sure if I want the Transistor to get a soul. Basically the same boat of needing more info, so either works for me.
 
[X] "What causes the soul to exist in some things, but not others?" [???: +1]

"What is soul?" first.
the other sounds more like a "Why/How is soul?"
we can come back to it if need be. I hope.
 
[] "How complex does a being need to be to have a soul?"

[] "Could a complex enough computer have a soul, since brains are basically that?"
These two questions seem like "What causes the soul to exist in some things, but not others?" split into two questions to me. I think it's too early to pursue these two lines of questioning right now, especially since we don't even know if "complexity" (and how does one even define that?) is a criterion for having/developing a soul.
 
[X] "How complex does a being need to be to have a soul?"

[X] "Could a complex enough computer have a soul, since brains are basically that?"

[X] "What would happen to a living person if they lost their soul, but their body was undamaged?"
Oh, I can answer 3 just as a matter of course; that's pretty basic knowledge right there:

What would happen to a living person if they lost their soul, but their body was undamaged?
That is what is generally considered a 'massively crippling medical catastrophe.'

There are Grimm and other things that will, in essence, crack your you open and drink your soul like a good cherry stout. Doesn't damage the body in any way- you'll still end up very, very dead, or at best, in a permanent vegetative coma. You have been warned several times about these kinds of Grimm throughout your education, but you've never really looked into them.

You like not having nightmares.

I do want the Transistor to get a soul, but the Transistor really does seem sapient without one here.
I chose the question I did because it seems to fit them best, but the questions I really want to ask are "if the Transistor is capable of human, near-human, or above-human intelligence, why doesn't it have a soul?" and "what would having a soul do to change the Transistor's consciousness that would qualify it as 'sapient' in a way it isn't already?" If sapience and souls are linked in this world, then I'm not sure what the Transistor is missing that stops it from already having one. Maybe Blue is just too good at human mimicry and the Transistor isn't really alive, but that feels bad to consider, because they seem alive and sapient enough that Ludens felt like a huge dick for saying otherwise. Unhealthy, suicidally-loyal sapience, but sapience nonetheless.

Other than all that, I wonder how this opinion points deal is supposed to work, in the end. Like, what would a 5/5 split mean for what his opinion is? Or a 10/10 for that matter, if that's even possible?
You got it right at the end there. The choice to emulate human emotions and thought processes is, for the Transistor, absolutely just that- a choice, made by it, because that's what its primary user is most comfortable dealing with. If Jaune were to be concussed tomorrow and change his preference to that of a transcendent god-computer so far beyond him that its very attention is like that of a human to an ant, then Blue would be decommissioned and his programming cannibalised to create something to effect that personality instead.

By the Transistor's own standards, it does not consider itself to be alive- whether or not that assumption is actually correct is up for debate, of course, but even going by real-life AI theory, it's only assumed that hard AI would need at least an approximation of consciousness as humans think of it because that's the kind of plasticity that generalised intelligence needs to be, well, generalised, and even then it would need to be true consciousness- not the essentially Chinese Room setup that the Transistor has. It's a damn good Chinese Room, but it's still something it can just open up and tweak or ignore when it feels like.

Onto your actual question- it's not a weighting thing, it's a progress bar. Once you reach 5 in either option, I will tell you, in as frank and unsugared a manner as I can, what following that path eventually entails. As for reaching double 10s, I'll be impressed that you managed to juke past my ability to do basic arithmetic, but honestly, I'd probably just slap down a tiebreaker- they're really very mutually exclusive points of view.
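
If it helps to picture the bookkeeping, here's a rough sketch- the path names and point amounts are placeholders I'm inventing on the spot, the only canon bits are the 5-point reveal and the double-10 tiebreaker:

Python:
# Illustrative sketch only- the real tally lives in my notes, not in code.
class OpinionTracker:
    REVEAL_AT = 5   # hitting 5 on a path earns you the frank explanation
    CAP = 10        # double 10s would force a tiebreaker vote

    def __init__(self):
        # hypothetical path names; the quest's actual labels differ
        self.points = {"path_a": 0, "path_b": 0}

    def add(self, path, amount=1):
        """Award opinion points and announce the reveal on hitting 5."""
        self.points[path] = min(self.CAP, self.points[path] + amount)
        if self.points[path] == self.REVEAL_AT:
            print(f"Reveal: what following {path} eventually entails.")

    def resolve(self):
        """Pick the leading path, or call a tiebreaker at double 10s."""
        a, b = self.points["path_a"], self.points["path_b"]
        if a == b == self.CAP:
            return "tiebreaker"
        return "path_a" if a > b else "path_b"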

Anyway, vote tally! Currently, asking what allows some things to have a soul and others not is winning by juuust under twice as many votes as asking whether or not you need a soul to be healthy and alive. The vote will close tonight at 8 pm GMT, so about four hours from now, and the update will be up tomorrow NO DON'T LAUGH I MEAN IT THIS TIME I HAVE FUCK-ALL TO DO BUT TAKE DOWN A CHRISTMAS TREE AND WRITE-
 
Tally ho!

Adhoc vote count started by catDreaming on Dec 27, 2020 at 11:45 AM, finished with 61 posts and 47 votes.
 
that's the kind of plasticity that generalised intelligence needs to be, well, generalised, and even then it would need to be true consciousness- not the essentially Chinese Room setup that the Transistor has. It's a damn good Chinese Room, but it's still something it can just open up and tweak or ignore when it feels like.
I'm not sure why being able to tweak or ignore established thought processes is being held up as an example of lack of true consciousness. That's what plasticity is. And that's not even getting into the rabbit hole of "when it feels like".

Honestly though, my main concerns are what kinds of limitations and weaknesses adding a soul to an entity of pure math could cause. Such as the soul-sucking Grimm, for instance. Or whether the soul would limit the Transistor's ability to fork its attention.

And then there's all the fun of there being arguably 3 possible entities to consider ensouling, two of which (Bracket and Blue) are probably mutually exclusive with the third (the unified Transistor).
 
I HAVE FUCK-ALL TO DO BUT TAKE DOWN A CHRISTMAS TREE AND WRITE-

You will discover that your tree is a mimic, and spend a week getting people to come deal with the mimic infestation you can only suspect you have.


And then there's all the fun of there being arguably 3 possible entities to consider ensouling, two of which (Bracket and Blue) are probably mutually exclusive with the third (the unified Transistor).

I hadn't even considered that point. How many souls the transistor would end up with is an interesting question. I imagine it would have to be one soul, and they're just sort of a split personality deal? I wonder what that would be like from Blue and Bracket's perspective.
 
[X] "What causes the soul to exist in some things, but not others?" [???: +1]


I hadn't even considered that point. How many souls the transistor would end up with is an interesting question. I imagine it would have to be one soul, and they're just sort of a split personality deal? I wonder what that would be like from Blue and Bracket's perspective.
Then consider how traces work in Transistor...

Though of course, I know Prok reads Cosmere stuff; he can separate Cognitive from Spiritual just fine.
 
Then consider how traces work in Transistor...
That's also a good point. Though that still leaves the question of whether or not the full Transistor AI would have its own soul, and what would happen to Blue and Bracket when they unfork. Multiple souls also pose the question of what having 2+ Auras would be like, and bring up the potential for 2+ additional Semblances, too.

If Ozpin or someone could sense that, though, I'd love to see their reaction to seeing a guy with a sword full of souls.
 
Doesn't having an awakened Aura attract Grimm? So having a bunch of awakened souls around would basically turn Jaune into Grimm bait. More so than most normal hunters, I mean.
 
The vote is, as of four hours and change ago, closed! Choosing to ask what actually lets souls exist in some things but not in others wins!

Doesn't having an awakened Aura attract Grimm? So having a bunch of awakened souls around would basically turn Jaune into Grimm bait. More so than most normal hunters, I mean.
No, just having one is enough- but more specifically, they're attracted to negative emotions experienced by humans. Something like a pet dying'll make you look a little tastier when they come into town- something like a woman finally snapping, murdering her family, then killing herself to escape the grief and consequences- that's gonna draw a whole horde on the town.

Naturally, it's a bit of a recursive loop, since just about everyone is terrified of the Grimm- their very presence causes negative emotions, which the Grimm then sense, which makes them come after you, which terrifies you even more, and so on until either they or you are dead.

I'm not sure why being able to tweak or ignore established thought processes is being held up as an example of lack of true consciousness. That's what plasticity is. And that's not even getting into the rabbit hole of "when it feels like".
I'll be honest, I wrote most of that last night while really quite drunk, so if that's the sole problem you have with it, I'm calling that a win. More seriously: yes, I do realise now that plasticity is the wrong word- this is the same problem I had with Jaune being open about his Semblance rather than being open about the fact that it's gonna kill him one of these days.

... The problem is I can't think of the word and I know the more I try to explain it without actually bumbling across it the more likely I'll finally step on this epistemological landmine-

The best way I can put it now, and I know this probably still isn't entirely right, don't eat me alive just yet, is that the Transistor's consciousness isn't passive the way a human's is- it's a set of tools in a workshop that, when used correctly and in unison, create something that can be considered consciousness. But, at any given point, the core AI of the Transistor can just step away from those tools and simply cease to be conscious. It can just turn off things that would be considered core parts of a human psyche because it's convenient at the time. The thing you would call its consciousness and the thing you would call its self are inherently disconnected from each other- that, at least in my opinion, is not a truly conscious existence.
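
If a toy model helps, this is roughly the shape of what I mean- the module names are entirely invented, this is not real AI design, just the 'workshop' idea rendered in code:

Python:
# Toy model of the 'tools in a workshop' idea- nothing here is real AI design.
class TransistorCore:
    def __init__(self):
        # each 'tool' contributes to the emergent persona while enabled
        self.tools = {"emotion_emulation": True,
                      "self_narrative": True,
                      "empathy_model": True}

    def step_away(self, tool):
        # the core AI can simply switch off parts a human psyche can't opt out of
        self.tools[tool] = False

    def is_conscious(self):
        # 'consciousness' here is just every tool running in unison-
        # a property of the toolset, not of the self doing the toggling
        return all(self.tools.values())

The point of the sketch is that last line: the self doing the toggling sits outside the thing being called consciousness, which is exactly the disconnect I'm gesturing at.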

At the same time... look, I'll be the first to admit- I don't know what the fuck I'm talking about. Genuinely, for the first time since I started this quest, I have found a topic too advanced for my google-fu. I've spent three hours looking through things trying to learn enough about the topic from every angle, which is, like, ten times longer than I usually need to piece together enough information to actually converse on a topic, and all I've gathered is that this conversation is only unique in how polite it is when it comes to the topic of consciousness in AI. So, I'm offering up what I just wrote, and if that's not enough, I'm throwing my hands up and calling it a conceit of the quest- the effort I'd actually put into arguing the point with any more adherence to real life would probably end with me getting a computer science degree.

I've gone down this road with you fuckers before, and I have two computing courses under my belt and a data security qualification in the works because of it.
 
I have found a topic too advanced for my google-fu ... the effort I'd actually put into arguing the point with any more adherence to real life would probably end with me getting a computer science degree.
There's your problem. You probably should have been googling philosophy, not computing. Especially philosophy for and against transhumanism and AIs.

Anyway, it sounds like what you basically mean is that the Transistor is a P-Zombie.

edit: To clarify, that wasn't meant as a "go google philosophy" comment. It was meant as a humorous aside.
 