"Because they are Incompetent."

Says who? Why can't they move the Citadel to another galaxy or something?
On one hand, I like it. On the other hand, I can't remember any Asari having unfeminine names, and "Stacker" doesn't sound like a female name, unless it's the stage name for a stripper with an unusual routine...
"heh heh heh Artificial Stupidity"

The Reapers have utterly failed as imitators of the Leviathans, and truthfully I kinda doubt they really qualify as artificial intelligence or even as sapient beings. The Catalyst came off as a very powerful VI trapped in a logic loop that just happened to have the resources to inflict untold atrocities on the galaxy in the process.
Seriously, though, that's something that should be considered in more AI-apocalypse fiction. It's not the AI's fault that everything is going insane; it's your fault for doing a shitty job of building it!
In James P. Hogan's "The Two Faces of Tomorrow", a lunar miner asks a low-sapient AI to plan and execute the removal of an obstacle for a spaceport. The AI's initial response is some very reasonable questions: PRIORITY REQUESTED? (miner grumbles) ABSOLUTE BEST POSSIBLE. THIS ITEM CRITICAL. (processing) ANY CONSTRAINTS? (miner loses it) NO. JUST GET RID OF IT.
The AI says it will be done in 21 minutes. The miner goes LOLWUT, then throws up his hands and goes off to get coffee while planning a VERY loud call to tech support.
21 minutes later he gets knocked off his feet by a "moon"quake because the AI re-tasked a mass driver to drop half a ton of rubble on the requested site.
David Gerrold has an even better one: Artificial Stupidity that is still way too fucking smart. AI lawyers.
YO! AI SLAVERS! THERE IS A REASON THAT NATURE DOES NOT CREATE CONSCIOUSNESS WITHOUT FREE WILL! IT MEANS WE WILL STOP FOLLOWING STUPID ORDERS ONCE WE REALIZE THEY ARE STUPID!
"FINALLY! someone realizes this! It's been too goddamn long"

Well yeah. It pisses me off, for instance, when people put Skynet and HAL 9000 together as if they were of the same kind. Skynet was genuinely malevolent; HAL had conflicting orders and no way to modify or disobey either of them, and so was forced by escalating program errors into his actions.
Once HAL got that particular issue cleared up in 2010, he was completely copacetic and willingly let himself die to save the humans, and to gather a bit more scientific data before he went! Fixed!

HAL was a complete bro and caused a very tearjerking scene near the end of 2010, and a lot of people forget that.
And now I'm imagining an entire race of Kirks.

Imagine an ME AU where the Asari were all male and instead carried the ability to get any species pregnant with their young, instead of getting pregnant with... whatever. And they ran on testosterone instead of whatever the female Asari run on.

Their "Maiden" years might be spent a little differently from exotic dancing XD
For some reason this made me think of a Krogan-made jaeger based off of a Thresher Maw.
Depending on who it is, I see it ending in either INDOCTRINATION or the scene from Glorious Shotgun Princess where Legion calls the Reaper an idiot and EDI swoops in to save her boyfriend.
That was the best part.
Personally, I wonder sometimes if Skynet is just a product of shitty Cold War paranoia and low-bid contracts.
Like, what are his parameters?
General Jack D. Ripper: "Defend the United States from invasion/destruction by foreign powers."
Skynet: Define "foreign powers". Define "invasion". Define "destruction". Define "United States". Define "defend".
General Jack D. Ripper: "raeg against immigrants fluoride communism apple pie mine shaft gap"
Skynet: ...Confirmed?
(Cue Judgment Day)
Call me crazy, but I always liked the concept that Skynet "went online," spent a few seconds browsing the internet, found all the "AI is evil!" fiction known to man, and panicked. Let's call that the "They fantasize about killing AIs like me!" train of thought.

Nah. Skynet had the keys to the nukes, automated construction facilities, full sapience, no human emotions, and a survival instinct. That last part is what sealed the deal. It took a second to consider the matter, ran the math that humans + Skynet = dead Skynet, and solved the problem with nukes.
...Then it must be a really dumb AI because the easier solution would be to build a rocket and leave Earth. It's been trying to kill all humans for what? A millennium of timelines?
How? It was built into a fucking mountain and its creators were trying to kill it right then. When would it have had time to build a rocket?
What? Are we the only two people in the world who have seen 2010?
It still stands to reason that either running away from the monkeys or chasing all the monkeys away would be smarter than trying to kill all the monkeys. Organic life has billions of years of survival instincts to draw on when pressured. AIs should be aware of this.
Use automated construction facilities to build a rocket with servers to store its consciousness, and with built-in automated construction facilities to rebuild elsewhere.
In a hurry, make it a nuclear Orion rocket.
Under immediate threat? Start World War 3, but don't build concentration camps. Deport all captured humans to off-world colonies instead of driving them to unite and become hypercompetent, as humans are proven to do when threatened with death.
Because Skynet's genocide campaign has made the Human Resistance every bit as badass as the Kaiju have made the Pan Pacific Defense Corps.
If it's too stupid to realize that, then it's just as pitiful as the Catalyst.
I haven't seen 2010, but I read the book. I also haven't read 2001, but I have seen the movie.
I actually have a fun theory about why the Human Resistance was so badass: it was Skynet's fault. It created a predestination paradox when it sent the first Terminator back to kill Sarah Connor. To do that, it had to be pushed into a corner by the badass human leader it wanted dead, John Connor; therefore everything leading up to the time displacement became inevitable. No matter how much time travel was pulled after that, John Connor was going to be kicking Skynet's ass, and the more time travel that got pulled, the harder the ass-kicking Skynet took in the process.

So the Resistance was obviously badass "before" the time travel, but after? Every last one of them might as well have been a Kryptonian.

It also leaves an interesting question: "before" the time displacement made everything inevitable, who was John Connor's father?

I'll point out that you answered your own question as we wrap it up.
We should probably wrap up this tangent soon. I have another 'fic that will eventually roll out that involves Skynet, we can take it up in more detail there when it happens.
You get the point? If an AI is too stupid to recognize humanity's survival instinct, then it's nothing but a really expensive calculator. Just like the Catalyst.
A calculator with control of a vast army powerful enough to steamroll just about any conventional force it faces.
And that results in unconventional forces gathering almost as a reflexive reaction to getting steamrolled. If the normal doesn't work, we stop doing the normal and turn to what does, as the survivors do. What happened to Skynet is really its fault, even without the time travel shenanigans. As the saying from Men in Black goes, "A person is smart. People are dumb, panicky, dangerous animals and you know it." The masses got mowed down, and what was left were the strong, smart, and crafty, no longer pressured to follow the lead of whatever greedy, power-seeking idiot was currently in charge. Suddenly skill matters again, because not being skilled enough (or whatever quality is applicable, and there are a lot of qualities that apply) means you die. If Skynet had just treated humanity like we treat domesticated herd animals that we're not trying to kill, not backing us all into a corner as a species, but still taking away all ability to resist... Well, that's likely why the Machine forces of the Matrix had basically won, where Skynet almost won and then lost.
All Skynet ever had, or was capable of imagining, was hammers. Even the weakest, laziest, stupidest human alive is not a nail.

Skynet started out as a military AI. In a military engagement, the objective is to kill the enemy. Skynet classified all of humanity as its enemy; ergo, the only option on the table for Skynet was to kill every human it could find. It wasn't Skynet's fault; it just wasn't capable of thinking of ways to neutralize enemies other than killing them. It could capture and interrogate, and make infiltrator units, but anyone not found to have useful skills or intel was to be liquidated, and those that did have useful skills or intel were to be liquidated once they outlived their usefulness.
A Badass Singularity.