Alia ([personal profile] xrater) wrote in [community profile] piper90 2021-01-13 10:20 pm (UTC)

Trust issues, honestly.

In the early 2000s, a madman used Robot Masters... excuse me. In the history of AI, robots were limited strictly to the bounds of their programming. Robot Masters were roughly the same, but capable of greater freedom of thought within those bounds, and they were used to command robots in areas too dangerous for humans. It wasn't until Reploids came about that artificial intelligences had true freedom of thought and the ability to do as they pleased.

Back to the subject at hand: a madman named Doctor Wily used Robot Masters in nearly two dozen attempts to take over the world. In theory, robots were bound by the Three Laws of Robotics to keep them from being a danger, but his Robot Masters had those safeguards stripped away. With a man like that around, capable of building his own death machines or corrupting the work of others, it's easier to understand why humans came to fear the machine.

With Reploids, some of that fear lingers. On top of it, there's the Maverick Virus, which causes machines to go mad and seek to annihilate humanity. Two space colonies and a sky lagoon have already been destroyed in Maverick conflicts, so it comes as no surprise that humans hesitate to put yet another AI in direct control of an important facility.
