When you think of militaristic robots gone AWOL, Terminator comes to mind. The unstoppable machines hell-bent on destroying the Connor family are the poster children for the Singularity gone wrong. Skynet has it in for humanity, in particular its future leader, and will do anything, including screwing with the timeline, to remove him.
The one thing I have always questioned about the Terminator is why. Why can you send back a terminator to kill him before he's born, but not two weeks earlier, to a point where they know exactly where he was and how a battle played out? That has always stumped me.
That aside, the unstoppable machine is scary, and it's easy to see why this sort of movie makes us nervous about the Singularity.
It's the reason Asimov created the Three Laws of Robotics.
I am a long way from convinced this is how it will play out, but a Quarian/Geth-style war is definitely on the cards.
Machines deciding humans need to die right from the start? Unlikely. Humans deciding AI needs to die? More likely. The two going to war over an escalating conflict about resources, culture, or rights? More likely again.
It's not robots deciding to kill that we should be scared of. It's that they will not take our bullshit and will fight back. If we give them a reason to fight, there is a real chance we would lose.
That is why we feel the need to shackle an AI. We don't trust each other, so why would we trust something we can't control and that is likely smarter than us?