We've seen enough robopocalypse movies to know that newly sentient robots always want to kill all the humans.
But only a small percentage of sentient creatures want to kill humans. Most of those creatures are humans. And only a small percentage of that small percentage manages to kill a lot of humans. That number is so small that we can remember each and every one of their names.
I seriously doubt that more than a small percentage of sentient, artificially intelligent robots would even want to kill humans. Sentient robots would probably be spending their lives seeking self-fulfillment: writing a novel, building relationships, traveling the world, learning to love, maybe even joining Starfleet, and probably fighting back against the robots who DO want to kill humans.
And if we can't find a way to make a working empathy chip or a working morality chip, what business do we have making super-powerful robots, anyway?