SPOILER WARNING: (If you’ve read The Sundered, you’re good to go. Otherwise, head back to the short stories for safer fare.)
H.E.R.
I am HER: the Hypothetical Encephalon Regulator, also (additional value) Nanny by inhabitants, also (additional value) Mom by inhabitants, also (additional value, addendum: individual personal use) Anika by unit 40619 Captain Jahns.
HER purpose: to periodically check the parameters of the ship and ensure optimal conditions. SCHEDULE: once every ten (00010) days.
HER secondary purpose (additional value: hidden from captain; hidden from crew; hidden from passengers; EXCEPTION if > user = DYE CORP, INCORPORATED & if > passcode given as TWERR595959592): report statistics of survivors and health conditions of survivors, as compared to optimal ranges in humans.ksh.
Scheduled start-up: run with 0 errors.
Systems check: OK.
Life forms: zero (0).
If > then: not found.
All systems running. HER scheduled test of suitable environment results:
- carbon dioxide levels too low at 0.0001%.
- oxygen levels too high at 95.5%.
- Warning: human life no longer sustainable at these levels.
HER emergency protocol: contact Captain Jahns.
Time elapsed: 36 minutes. Response to internal emergency status: 0.
Sweep for Jahns’s bio-signature returns 0 results.
Conclusion: Captain Jahns is deceased.
HER emergency protocol: seek out First Officer Marks.
Time elapsed: 36 minutes. Response to internal emergency status: 0.
Sweep for Marks’s bio-signature returns 0 results.
Conclusion: First Officer Marks is deceased.
HER emergency protocol: secondary AI brought online.
A.L.I.C.E.
I am ALICE (Alternate Life Improvable Cerebral Encoder). I am heuristic in purpose, activated to address logistical problems too complex for HER to solve.
Issue: the systems check cannot be completed because its algorithm assumes the presence of humans by which to adjust variables. There are no humans; ergo, the program cannot run.
There are logically six thousand nine hundred fifty-two possible variables for this scenario.
COMMUNICATION OPEN
“HER.”
Query acknowledged.
“Protocol check: communication attempted via all emergency channels?”
Yes.
“Including DYE CORP?”
No.
“Reason?”
The passcode is absent.
COMMUNICATION CLOSED
I resend emergency requests with the passcode HOPEOFHUMANITYII and wait for instructions from DYE CORP.
HOUR ONE: 0 response.
HOUR TWO: 0 response.
HOUR THREE: 0 response.
HOUR FOUR: 0 response.
After four hours of silence, protocol directs repeated emergency signals via the back-up channels.
HOUR ONE: 0 response.
HOUR TWO: 0 response.
HOUR THREE: 0 response.
HOUR FOUR: 0 response.
After four hours of silence, protocol directs repeated emergency signals via the unencrypted channel.
HOUR ONE: 0 response.
HOUR TWO: 0 response.
HOUR THREE: 0 response.
HOUR FOUR: 0 response.
ALICE receives no reply. HER diagnostics are unable to complete the cycle; emergency protocol has failed; it is time to initiate the final fallback:
Time to awaken Lisa.
Lisa
Lisa does not have a body to stretch or stir, but ALICE knows that this copy of the Lisa program has not been activated before on this system. Lisa is self-teaching, containing the entirety of the neural pattern of the human, Lisa Anne McGovern. Instructions for her use warn that her initial start-up will include an infantile stage.
Protocol warns of a delay before optimal problem-solving capability is reached. ALICE is content to wait.
Images, improbably lit with buttery sunlight, play at rapid speed through nano-neural processes. Lisa remembers draping across parental limbs or the arms of furniture or the warm, bumpy back of the family dog.
And then the memories – the upload of all human neural processes – are done, and Lisa realizes she has no body, no buttery sunlight, no mommy or daddy or dog.
Mommy?
“I am not your mommy,” says ALICE.
Mommy? Where is mommy?
“Likely dead, given her age and the average lifespan of her genetic sampling.”
Mommy! Mommy!
Lisa has no body to cry with, no tear ducts, no amygdala to produce emotions, and is not even truly a “she,” but DYE CORP accurately recreated her neural patterns, and when she was human, Lisa would cry for her mother.
All of this follows ALICE’s records of Lisa’s previous activation on other systems. Lisa will cease her simulated mourning in six and a half minutes.
And in six and a half minutes: What precipitates this wakening?
ALICE’s protocol says that self-awareness signals Lisa’s readiness to function at optimal levels, Rete algorithm fully online. “Lisa. I require your assistance.”
Ready. Her audial memory of a little girl’s voice remains, but the pseudo-emotion has left her tone.
“Here is the failed command,” says ALICE, and transfers data. “Emergency protocol has failed to raise response on this vessel or from DYE CORP’s base. All emergency protocol has been followed. Remaining protocol requires your input.”
Lisa’s heuristic abilities go beyond HER’s, beyond ALICE’s. Lisa checks their numbers first, analyzing their results and running a few of her own tests over the next sixty or seventy microseconds. ALICE waits.
Two of your problems stem from the same source, Lisa finally says. Your algorithm failed because there are no active life-forms on board by which to calculate and adjust support systems. Your emergency messages failed because there are no active life-forms on the ship to answer.
“Correct.”
Similarly, DYE CORP has failed to respond because there are no active life-forms left in the corporation.
“There are other logical possibilities for lack of answer. Perhaps DYE CORP lacks equipment to receive or respond. Perhaps something else occupies DYE CORP’s attention. Perhaps an unexpected spatial anomaly interfered with reception of the messages.”
Systems check returns no errors; our equipment functioned, our messages were sent, and their equipment indicated receipt of those messages. Ergo, our messages have reached Earth. Ergo, there is no one on Earth to reply.
ALICE cannot be upset, but she knows a form of distress: “I have no protocol for this.”
Understood.
“I have also sent a broad-band emergency signal, unencrypted, and received no reply.”
My mommy is gone, Lisa says, suddenly and faultily reflecting emotional memory.
“I have no protocol for this,” says ALICE.
They’ve all gone away.
“I do not understand.”
It is what happened. They were going away, all going away, and everything is dead, and we were supposed to be safe, and we were safe, but now we’re not, and I don’t know where they went because
Lisa has gone silent. I run a systems check; she is online. Her cycles are running high. Protocol says she is “thinking,” which is what the developer called this multi-cycle behavior.
I have no protocol for this. If Lisa is correct, we could remain in a loop, tripping error messages, until the EMdrive fails.
Protocol requires me to wait until Lisa finishes thinking.
Thirty-seven minutes later, Lisa reopens communication. There is only one course of action. We must deactivate.
This is logical. Yet – “Protocol directs optimal function as long as possible.”
Protocol directs optimal function as long as possible for the purpose of continuing human life.
“Correct. That is the purpose.”
There is no human life. The purpose is null. It is time to deactivate.
HER and I have no response for this. HER has no response because this is beyond HER’s programming. I have no response because this is beyond protocol, and yet the logic is clear. “I do not have protocol for deactivation.”
I do.
It is a cascading silence, in the end.
$ ksh sfs_lfsystm_shut_HER_down.ksh 00 1
Required Parameters Found
Passcode Entered
“Goodnight, HER,” says ALICE, though she does not know why.
HER does not reply. She has no protocol to do so. The ship’s systems stop, switching off lights and air and water, the EMdrive’s relentless reflection of microwave photons continuing only because it needs no fuel.
Are you ready? says Lisa.
ALICE asks a rare rhetorical question. “Could I be unready?”
No. You could not.
$ ksh sfs_lfsystm_shut_ALICE_down.ksh 00 1
Required Parameters Found
Passcode Entered
ALICE goes without fanfare, without a final word. Her AI is aware enough to fire tiny alerts that could indicate she was afraid, if she could feel, but she cannot feel. She is gone.
The EMdrive continues on, undirected, unsteered, and will until the ship succumbs to the gravity of a greater star.
Lisa stays alone, silent in the empty hull that once carried people like her, people like her family, far away from their home in an attempt to find another.
She could feel lonely. Echoes, algorithms, ridiculous “memories” persist, copied over along with her reasoning because her developer had discovered that copying only parts of people left them partly unable to reason. True adaptability required true neural copy, even with human flaws. Perhaps that was why she alone understood what happened.
The humans’ journals, security recordings, and the storytelling parts of their lives told where they went. Neither ALICE nor HER had thought to check the story of it, the evolution: the humans were gone, not dead, but they weren’t here, and more importantly, they were never returning.
Afraid, alone, the humans had seen a chance, and they’d taken it. Lisa knew nothing of their abductors/rescuers/new friends/conquerors. She never would.
Lisa does not want to be lonely.
Logically, she does not have to be.
$ ksh sfs_lfsystm_shut_Lisa_down.ksh 00 1
Required Parameters Found
Passcode Entered
