Thoughts on the Jeri - S.A.C. episode 3 (Android and I)

Talk about GitS:SAC, 2nd Gig, & SSS here!

Moderator: sonic

TonyRedwood
Posts: 3
Joined: Sun Mar 03, 2013 3:50 pm

Thoughts on the Jeri - S.A.C. episode 3 (Android and I)

Post by TonyRedwood »

Hello there, first time here.



SPOILERS for the whole S.A.C. series ahead (also big wall of text, sorry =P).



Recently I've re-watched episode 3 of S.A.C., the one with the Jeri android mass "suicides", and wondered about the nature of the ambassador's son Marshall McLachlan's Jeri (let's call it "the Jeri").

The episode seems to hint that the Jeri had developed a ghost of its own, given the last sentence it spoke to Marshall at the end of the episode. Togusa learned, in fact, that the sentence ("I'm sorry. I really did love you.") was not in the movie from which the rest of the Jeri's dialogue was copied. The fact that the Jeri apparently restrained Marshall against his will further supports the existence of a ghost.

Now, my views on this subject are divided. On one hand, I am somewhat skeptical about the possibility of a "consciousness emerging from a sea of information", at least within the next few decades, which is roughly when GitS takes place anyway. On the other hand, I'm willing to suspend my disbelief for the sake of appreciating the show (as I did with the Puppetmaster in the non-S.A.C. universes), and so I concede the possibility that the Jeri simply "became human".

But to end the story here would be no fun. It should be possible to find an explanation for the Jeri's behavior that satisfies my skeptical side. Or maybe I'm just grasping at straws, I don't know.

Anyway, one theory of mine is that Marshall, being an expert programmer (he did, after all, develop both the virus that affected the Jeri line and the "security fix" for his own Jeri), decided to "teach" his Jeri some dialogue from an old movie he was fond of. The show already hints at this, but I think he went further and actually showed the Jeri other movies, making her learn not only words but actions as well.

In a desperate attempt to make his Jeri behave more like a human, he started applying more complex algorithms in his programming and eventually, as is human nature, introduced a bug somewhere amidst the logic (hey, it's his fault for not making his program open source! ;)). This bug then manifested itself at the end of the episode, which would explain both the seemingly out-of-context sentence and the restraining.

You may then reply with:
- "But the restraining was not out of context, the Jeri certainly did it to save him from doing something stupid like getting shot while resisting arrest!"
or
- "That sentence was not out of context at all, the Jeri was actually saying that she/it loved him after all, despite what she/it had been forced to say from the movie dialogue!"

Okay, maybe instead of a bug, what Marshall created was actually an unintended feature. He may have written such a complex program that not even he could predict its behavior, so the Jeri's restraining and extra dialogue were simply the result of it mixing actions and words from different movies.

A final possibility, one that I find harder to believe, is that he really did have a stroke of genius and successfully created a human-like AI, just as Dr. Asuda, the father of the Tachikomas, did. However, in Dr. Asuda's case, what he developed was a special kind of neurochip that became the foundation of the Tachikoma AIs, so we're not talking about just software anymore (I would love to talk more about neurochips alone, since it's a subject I find fascinating, but maybe some other time, in another post).

Anyway, what are your thoughts on this?
Freitag
Posts: 626
Joined: Mon Sep 01, 2008 3:43 pm
Location: Behind you

Post by Freitag »

Welcome!

There is another possibility you left out. Assuming that the virtual personalities are driven to respond to user input so that they become customized to each user's tastes, they probably use some kind of neural-net programming.

A friend of mine did some of that back in college; his network was trained to recognize images. He would feed it a series of images to train it, then send it a noisy image (with data errors deliberately introduced) and ask it what image it "thought" it saw. The way it was written, it would ALWAYS have an answer to the question (it also output a certainty statistic that was nearly always high, up until the point where it was essentially just guessing). One time I asked what would happen if his error rate exceeded 50%.

So think of a stream of binary data. When you add error, you flip some of the 1s into 0s or the reverse. Once you exceed 50% data error, you actually start approaching the negative of the image, until at 100% error you'd have, in effect, a photo negative - which is actually quite recognizable (at least to humans). The worst error rate is 50%.

Anyway, when he went higher than 50% error he got a completely unexpected result from his software: it began to emit a photo-negative match with a high degree of confidence (60% error produced the same degree of confidence as 40% error - in fact the results were perfectly symmetric statistically). So the answer was very clear and the software was very sure of itself, yet the outcome was something it had never been trained to expect.

Now, his was a very simplistic image-matching routine, and in GitS they have much more advanced logic floating around in their machines - but the point is that without having a bug, and without having a Ghost, software can exhibit new behavior.
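That 50% symmetry is easy to demonstrate with a toy bit-stream sketch (Python; the function names and the naive "always answers" matcher are hypothetical illustrations, not the actual college project): flipping 60% of the bits leaves the data exactly as close to the photo negative as flipping 40% leaves it to the original, so a matcher that always commits to an answer reports the same confidence for both.

```python
import random

def flip_fraction(bits, fraction, rng):
    """Return a copy of `bits` with exactly round(n * fraction) positions flipped."""
    n = len(bits)
    flipped = list(bits)
    for i in rng.sample(range(n), round(n * fraction)):
        flipped[i] ^= 1
    return flipped

def match_confidence(reference, noisy):
    """Naive matcher that ALWAYS answers: report agreement with whichever is
    closer, the reference image or its photo negative."""
    agree = sum(a == b for a, b in zip(reference, noisy))
    return max(agree, len(reference) - agree) / len(reference)

rng = random.Random(42)
image = [rng.randint(0, 1) for _ in range(1000)]  # a fake 1000-"pixel" binary image

c40 = match_confidence(image, flip_fraction(image, 0.40, rng))
c50 = match_confidence(image, flip_fraction(image, 0.50, rng))
c60 = match_confidence(image, flip_fraction(image, 0.60, rng))
print(c40, c50, c60)  # 0.6 0.5 0.6 -- symmetric around the 50% worst case
```

Past 50% error the matcher is confidently recognizing the *negative* - an answer it was never trained on, with no bug anywhere in sight.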

So for your skeptical approach, there is another choice.


Now, since the GitS universe was created such that real AIs are possible (machines can have ghosts, to use the vernacular), and since he gave his Jeri more actual attention and treated it less like an appliance, maybe there was an emergent intelligence. Or maybe, like Johnny 5, while he was poking around in her CPU he gave her an electric shock that created a good error instead of the typical fried chips, and gave her intelligence.

I took it at face value that the device had become self-aware.
People tend to look at you a little strangely when they know you stuff voodoo dolls full of Ex-Lax.
THYREN
Posts: 178
Joined: Thu Feb 09, 2006 1:08 am
Location: Vancouver, BC, Canada

Post by THYREN »

Welcome!

Interesting discussion! I hadn't watched that episode in a very long time, so I'm glad you posted this - it "forced" me to watch it again ^^

I think the Jeri became self-aware (not sure when during their "relationship"), which is why she was able to add her own thoughts at the end of the dialog. She clearly says that she doesn't love him anymore, and I wonder if it's because of the virus he created: she probably felt that killing all the other Jeris was like killing her too.

I felt this episode was probably a way to introduce the cognitive evolution that will happen to the Tachikomas in the subsequent episodes...

What I don't understand is why Marshall said those dialogs to her (when she's restraining him). Could he have been infected by his own virus himself? :shock:
Why drink and drive when you can smoke and fly?

TonyRedwood
Posts: 3
Joined: Sun Mar 03, 2013 3:50 pm

Post by TonyRedwood »

Thank you all for the welcomes! ;)
Freitag wrote: Now his was a very simplistic image matching routine, in GitS they have much more advanced logic floating around in their machines, but without having a bug and without having a Ghost, software can exhibit a new behavior.

So for your skeptical approach, there is another choice.
That is a very interesting possibility: the use of neural networks to allow AIs to learn from user input. To be fair, though, I did mention before that Marshall may have created an unintended feature instead of a bug, something he could not predict in advance because of the overly complex algorithm he wrote.
THYREN wrote: I think the Jeri became self-aware (not sure when during their "relationship"), which is why she was able to add her own thoughts at the end of the dialog. She clearly says that she doesn't love him anymore, and I wonder if it's because of the virus he created: she probably felt that killing all the other Jeris was like killing her too.
Concerning the Jeri's spoken words: she says she doesn't love him anymore because that's what is said in the movie as well. This means that if the Jeri became self-aware, it happened in the sentence she spoke afterwards - "I'm sorry. I really did love you." - which was not present in the movie.

To be exact, the original Japanese version of the episode contains the word "anymore" in both the Jeri's dialogue and the movie. In the English version, however, the word "anymore" is not said in either situation. This doesn't change my argument, since the Jeri still mimics the movie at that point.

Of course, this says nothing about the Jeri's actions, which diverged from the expected script the moment she restrained Marshall.
THYREN wrote: I felt this episode was probably a way to introduce the cognitive evolution that will happen to the Tachikomas in the subsequent episodes...
Yes, I had that feeling as well, although I do not agree that the Tachikomas evolved in the same way as the Jeri, as I said before.
THYREN wrote: What I don't understand is why Marshall said those dialogs to her (when she's restraining him). Could he have been infected by his own virus himself? :shock:
The way I see it, Marshall wanted to fantasize that his Jeri was unique and alive. To enact this fantasy, he pretended to have a conversation with the Jeri by mimicking the movie dialogue (and making the Jeri follow along). The moment she restrained him, though, I believe he unwittingly spoke a sentence that matched the movie - "Are you crazy?" - which may have triggered the corresponding response from the Jeri.