Needs
Aug. 5th, 2010 06:01 pm

The convention in science fiction is that any artificially intelligent beings would naturally adopt the same drives and goals as Homo sapiens. That is, they'll fight to survive, seek to gain understanding, desire to relate to others, and endeavor to express themselves.
So, more or less, Maslow's hierarchy of needs.
We are routinely shown examples of fictional robots who want to make friends, have emotions, or indulge in daddy-issue-inspired neurotic hang-ups. It's presented as a matter of course that with sentient machines come the robot uprising and the potential end of human civilization. I mean, if that were the case, they'd be right, but...
It's possible to have an intelligent being – something that can reason – that doesn't really care to relate to others. Or that doesn't care if it lives or dies. "I think, therefore I am" doesn't necessarily lead to "I want to be."
What I mean is, if we gave AI the same drives that human beings have (replacing our biological need to eat with a more machine-appropriate goal of "recharge yourself" or something), then the robot uprising would be inevitable. Supporting evidence: every single war and violent crime in the history of our species.
I watched a movie today. Is the lesson I'm supposed to get from "I, Robot" that if I'm a complete, illogical jackass then things will work out without any real, useful help from my end of things?