[AUDIO]
So, after Catra's birthday surprise, I kind of got to wondering. I mean, I'm nineteen. [She'd been eighteen when she first arrived here, but... Well. She hadn't announced her birthday, nor had Jorgmund advertised it for her.] Catra's twenty now. I guess that means Adora's nineteen or twenty. How old are the rest of you?

no subject
Why?
no subject
But I cannot be completely certain of his reasoning or his intent.
no subject
[Now, if his brain structure were based on a human's, that would make sense. It would be a fairly sensible shortcut to set up the basic process, then take a recording of other brains to 'teach' the relays in his matrix how to fire correctly. With multiple imprints, an average could be found, preventing any aberrations such as brain damage or psychosis from contaminating the experiment.
Interesting. But, ultimately, only a guess based on her own experiences and skills as a Reploid engineer.]
I... suppose that's sensible, considering some of the problems that method could cause. Wiping you would spare you from damaging yourself by having to unlearn those habits.
I'd still consider it much more ethical to simply build a new android to test those theories, though.
no subject
Would you? May I ask why?
no subject
Among my people, a personality can't truly be programmed. To do so would be to create a robot, not an android. Even the Robot Masters of the early 2000s, as human as they could seem at times, were simply operating within the parameters set for them.
For a Reploid, you need a seed program that randomly generates a personality base, which then grows and evolves as they gather experience and live their lives. You can copy their minds, their programming, their thought patterns, their... [Ugh. She hated this phrase.] DNA Soul, and put it all into a new, blank shell, otherwise identical to their old body, and without that seed it would simply be a very expensive chunk of hardware.
But since every seed is truly random, and it can't be conveniently 'shaped', you can't come up with a copy of that person. [Alia lets out a little sigh, pinching the bridge of her nose. He can't see it, but...] Essentially, whoever you were before, even if he was somehow incomplete or flawed, is gone forever. You stand in his place, but you are not, and never will be, the same person. That future has been denied to him by his creator in an act that, in my world, we would equate with murder.
As different as you are from me, that seems to apply here as well. No, it would have been vastly more ethical to let him live his life and create a new android to test your creator's theories on.
no subject
I see. An ethical and moral dilemma. Your perspective makes sense. Given that I was already his second prototype, I cannot necessarily disagree. [ Under no circumstance, after all, would he have done the same to Lal.
When Commander Maddox wanted to disassemble him, risking his destruction during the procedure and the loss of the incalculable substance of remembered experience through the memory transfer involved, those losses were a concern to him.
Why create an android with a positronic neural network capable of adapting and learning from past experience, and then delete that experience?
Strange. The way humans operate. ]
There are many questions about my creation that will most likely never be answered. And I will not be able to recover what was wiped: the individual who was lost, to use your parlance. All I am able to do is... continue to strive to grow from the memories I have created since my activation.