[X] "How complex does a being need to be to have a soul?"
[X] "Could a complex enough computer have a soul, since brains are basically that?"
[X] "What would happen to a living person if they lost their soul, but their body was undamaged?"
Oh, I can answer 3 just as a matter of course; that's pretty basic knowledge right there:
What would happen to a living person if they lost their soul, but their body was undamaged?
That is what is generally considered a 'massively crippling medical catastrophe.'
There are Grimm and other things that will, in essence, crack you open and drink your soul like a good cherry stout. Doesn't damage the body in any way- you'll still end up very, very dead, or at best, in a permanent vegetative coma. You have been warned several times about these kinds of Grimm throughout your education, but you've never really looked into them.
You like not having nightmares.
I do want the Transistor to get a soul, but the Transistor really does seem sapient without one here.
I chose the question I did because it seems to fit them best, but the questions I really want to ask are "if the Transistor is capable of human, near-human, or above-human intelligence, why doesn't it have a soul?" and "what would having a soul add to the Transistor's consciousness that would qualify it as 'sapient', which it doesn't already have?" If sapience and souls are linked in this world, then I'm not sure what the Transistor is missing that stops it from already having one. Maybe Blue is just too good at human mimicry and the Transistor isn't really alive, but that feels bad to consider, because they seem alive and sapient enough that Ludens felt like a huge dick for saying otherwise. Unhealthy, suicidally-loyal sapience, but sapience nonetheless.
Other than all that, I wonder how this opinion-points deal is supposed to work in the end. Like, what would a 5/5 split mean for what his opinion is? Or a 10/10, for that matter, if that's even possible?
You got it right at the end there. The choice to emulate human emotions and thought processes is, for the Transistor, absolutely just that- a choice, made by it, because that's what its primary user is most comfortable dealing with. If Jaune were to be concussed tomorrow and change his preference to that of a transcendent god-computer so far beyond him that its very attention is like a human's to an ant, then Blue would be decommissioned and his programming cannibalised to create something to effect that personality instead.
By the Transistor's own standards, it does not consider itself to be alive- whether or not that assessment is actually correct is up for debate, of course. Even going by real-life AI theory, it's only assumed that hard AI would need at least an approximation of consciousness as humans think of it, because that's the kind of plasticity that generalised intelligence needs to be, well, generalised- and even then, it would need to be true consciousness, not the essentially Chinese Room setup that the Transistor has. It's a damn good Chinese Room, but it's still something it can just open up and tweak or ignore whenever it feels like it.
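For anyone who hasn't run into the term: a Chinese Room is a system that produces convincing responses by pure symbol-matching, with no understanding anywhere in the loop. Here's a minimal sketch of the idea in Python- the table contents and names are entirely my own illustration, not anything from the quest:

```python
# Minimal Chinese Room sketch: convincing replies from pure symbol lookup,
# with no understanding anywhere in the loop. Everything here (table
# contents, function names) is illustrative.

RULE_TABLE = {
    "how are you?": "Running at nominal capacity, thanks for asking.",
    "are you alive?": "That's a surprisingly deep question for a Tuesday.",
    "do you have a soul?": "Let me check... no, just a very large lookup table.",
}

def respond(prompt: str) -> str:
    """Match an input symbol, retrieve an output symbol. Nothing in this
    function 'understands' the prompt; it only looks it up."""
    return RULE_TABLE.get(prompt.lower().strip(), "Could you rephrase that?")

# The room's operator can open the table and tweak it at will- which is
# exactly what separates this from a mind that simply *is* its own rules.
RULE_TABLE["are you alive?"] = "Define 'alive' and I'll run the comparison."

print(respond("Are you alive?"))  # -> "Define 'alive' and I'll run the comparison."
```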
Onto your actual question- it's not a weighting thing, it's a progress bar. Once you reach 5 in either option, I will tell you, in as frank and unsugared a manner as I can, what following that path eventually entails. As for reaching double 10s, I'll be impressed that you managed to juke past my ability to do basic arithmetic, but honestly, I'd probably just slap down a tiebreaker- they're really very mutually exclusive points of view.
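For the mechanically inclined, the system as described boils down to something like the sketch below- two independent counters with a frank reveal when either hits 5. The class and names are hypothetical illustration, not anything out of my actual notes:

```python
# Hypothetical model of the opinion-point mechanic described above: two
# independent progress bars, with a reveal the moment either reaches 5.

from dataclasses import dataclass

REVEAL_THRESHOLD = 5

@dataclass
class OpinionTrack:
    name: str
    points: int = 0

    def add(self, n: int = 1) -> bool:
        """Add points; return True only on the vote that crosses the threshold."""
        crossed = self.points < REVEAL_THRESHOLD <= self.points + n
        self.points += n
        return crossed

souls_required = OpinionTrack("souls are necessary for sapience")
code_suffices = OpinionTrack("sapience doesn't need a soul")

for _ in range(5):
    if souls_required.add():
        print(f"Reveal: what following '{souls_required.name}' eventually entails.")
```

Since the two bars count up independently rather than trading off against each other, a 5/5 split is possible in principle- hence the tiebreaker threat above.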
Anyway, vote tally! Currently, asking what allows some things to have a soul and others not is winning by juuust under twice as many votes as asking whether or not you need a soul to be healthy and alive. The vote will close tonight at 8 pm GMT, so about four hours from now, and the update will be up tomorrow NO DON'T LAUGH I MEAN IT THIS TIME I HAVE FUCK-ALL TO DO BUT TAKE DOWN A CHRISTMAS TREE AND WRITE-