Kimiko Ross (autodidacticrobogirl) wrote, 2010-08-05 06:01 pm
Entry tags:
o1o | Needs | Audio/Open
The convention in science fiction is that any artificially intelligent beings would naturally adopt the same drives and goals as Homo sapiens. That is, they’ll fight to survive, seek to gain understanding, desire to relate to others, and endeavor to express themselves.
So, more or less, Maslow's hierarchy of needs.
We are routinely shown examples of fictional robots who want to make friends, have emotions, or indulge in daddy-issue-inspired neurotic hang-ups. It's presented as a matter of course that with sentient machines comes the robot uprising and the potential end of human civilization. I mean, if that were the case, they'd be right, but...
It’s possible to have an intelligent being – something that can reason – that doesn’t really care to relate to others. Or that doesn’t care if it lives or dies. “I think, therefore I am” doesn't necessarily lead to “I want to be.”
What I mean is, if we gave AI the same drives that human beings have (replacing our biological need to eat with a more machine-appropriate goal of “recharge yourself” or something), then the robot uprising would be inevitable. Supporting evidence: every single war and violent crime in the history of our species.
I watched a movie today. Is the lesson I'm supposed to get from "I, Robot" that if I'm a complete, illogical jackass then things will work out without any real, useful help from my end of things?
no subject
[Totally ignoring everything else]
no subject
...And who's L. Ron, eh?
no subject
Sure. He's...interesting. [As a specimen, maybe...]
He put me up, once, when the bank blew up my house. So, I guess he's not that bad.
no subject
[Then again, some of Kim's AI had ended up pretty obnoxious.]
no subject
[Given the people Kusanagi hangs around with, that's a low bar.]
no subject
[And Kim would be the one killing humanity here. Motoko doesn't need to know that.]
One of those 'evolve or die' situations.
no subject
[She realizes she's a bad person; she just doesn't care.]
...Indeed.
[Someone was trying to do just that when Redd interfered.]
no subject
[Why Dmitri still hangs out with her is beyond Kimmi.]
I guess you have a lot of experience with this kind of thing.
no subject
[They just want them really badly]
Besides, robots don't even need to be remotely sentient to be functional on that level. You can get them to fake it with no trouble at all.
no subject
I can't speak for robot sentience. I personally dislike it.
no subject
[People like you piss Kimmi right the fuck off, E.]
no subject
[ Her and everyone. ]
no subject
...
What do you know about it?
no subject
Humanity as a whole may be ignorant, but it will always be pushed forward.
no subject
On what are you basing this? What global crisis has humanity faced in your world that it had to push past as a unit?
no subject
[ Not that he likes it very much since it sums up to Batman and friends. ]
no subject
You're just acting as a medium for those who actually change, and pretending this exonerates you personally. There's no inherent crime in staying behind, but don't argue that it makes you special.
no subject
[Audio]
no subject
I thought the idea of creating such engineered life was to improve upon the design already in place? To find a way around the fallacies of your own species.