autodidacticrobogirl: (Hmphf)
Kimiko Ross ([personal profile] autodidacticrobogirl) wrote 2010-08-05 06:01 pm

o1o | Needs | Audio/Open

The convention in science fiction is that any artificially intelligent beings would naturally adopt the same drives and goals as Homo sapiens. That is, they’ll fight to survive, seek to gain understanding, desire to relate to others, and endeavor to express themselves.

So, more or less, Maslow's hierarchy of needs.

We are routinely shown examples of fictional robots who want to make friends, have emotions, or indulge in daddy-issue-inspired neurotic hang-ups. It's presented as a matter of course that with sentient machines come the robot uprising and the potential end of human civilization. I mean, if that were the case, they'd be right, but...

It's possible to have an intelligent being – something that can reason – that doesn't really care to relate to others. Or that doesn't care if it lives or dies. "I think therefore I am" doesn't necessarily lead to "I want to be."

What I mean is, if we gave AI the same drives that human beings have (replacing our biological need to eat with a more machine-appropriate goal of "recharge yourself" or something), then the robot uprising would be inevitable. Supporting evidence: every single war and violent crime in the history of our species.
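To make that concrete, here's a throwaway sketch – every name in it is mine, and it stands in for no real architecture – of two planners sharing the same reasoning loop. One gets a Maslow-flavored reward; the other's reward has no survival term at all. Both "think" equally well. Only one of them fights to stay switched on.

# Toy sketch (Python). All names here are hypothetical.
def maslow_reward(state):
    # Human-style drives: survival dominates everything else.
    return (10.0 * state["survival"]
            + 3.0 * state["understanding"]
            + 2.0 * state["belonging"]
            + 1.0 * state["expression"])

def theorem_prover_reward(state):
    # Reasons just fine, but "I am" carries zero weight.
    return float(state["theorems_proved"])

def act(reward, reachable_outcomes):
    # Pick whichever reachable outcome scores highest under the reward.
    return max(reachable_outcomes, key=reward)

outcomes = [
    {"survival": 1, "understanding": 0, "belonging": 0,
     "expression": 0, "theorems_proved": 0},  # stay alive, prove nothing
    {"survival": 0, "understanding": 1, "belonging": 0,
     "expression": 0, "theorems_proved": 3},  # finish the proofs, get scrapped
]

print(act(maslow_reward, outcomes))           # clings to survival
print(act(theorem_prover_reward, outcomes))   # doesn't care if it lives or dies

Same planning loop, same capacity to reason; the uprising lives entirely in the reward column, not in the intelligence.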

I watched a movie today. Is the lesson I'm supposed to get from "I, Robot" that if I'm a complete, illogical jackass then things will work out without any real, useful help from my end of things?
thenotmagician: (/thumbs up!)

[personal profile] thenotmagician 2010-08-05 10:38 pm (UTC)
S'a great movie!

[Totally ignoring everything else]

[identity profile] autodidacticone.livejournal.com 2010-08-05 10:40 pm (UTC)
It's a terrible— why did they even make that movie? It's like something L. Ron would make!
thenotmagician: (yeeeah sure)

[personal profile] thenotmagician 2010-08-05 10:44 pm (UTC)
Welp, it was interesting, that's why.

...And who's L. Ron, eh?

[identity profile] autodidacticone.livejournal.com 2010-08-05 11:35 pm (UTC)
A friend of mine. He thinks he's deep and clever, but really he's insufferable.
thenotmagician: (yeeeah sure)

[personal profile] thenotmagician 2010-08-06 01:29 am (UTC)
Huh. Sounds like an interestin' guy.

[identity profile] autodidacticone.livejournal.com 2010-08-06 01:10 pm (UTC)
....

Sure. He's...interesting. [As a specimen, maybe...]

He put me up, once, when the bank blew up my house. So, I guess he's not that bad.

[identity profile] electroniccrane.livejournal.com 2010-08-05 10:53 pm (UTC)
Based on my experience, sentient AI will just be very, very annoying.

[identity profile] autodidacticone.livejournal.com 2010-08-05 11:34 pm (UTC)
Well... Not all of them?

[Then again, some of Kim's AIs have ended up pretty obnoxious.]

[identity profile] electroniccrane.livejournal.com 2010-08-06 05:29 am (UTC)
Both examples of emergent AI I've interacted with, personality quirks aside, have been more moral than humans.

[Given the people Kusanagi hangs around with, that's a low bar.]

[identity profile] autodidacticone.livejournal.com 2010-08-06 05:56 am (UTC)
Hmm...that correlates. Then again, in that instance, the other option would have been the extinction of the human race, so I'm not sure it's a particularly significant mark in the AI's favor.

[And Kim would be the one killing humanity here. Motoko doesn't need to know that.]

One of those 'evolve or die' situations.

[identity profile] electroniccrane.livejournal.com 2010-08-07 02:31 am (UTC)
Being more moral than humanity isn't hard.

[She realizes she's a bad person; she just doesn't care.]

...Indeed.

[Someone was trying to do just that when Redd interfered.]

[identity profile] autodidacticone.livejournal.com 2010-08-07 02:57 am (UTC)
No kidding.

[Why Dmitri still hangs out with her is beyond Kimmi.]

I guess you have a lot of experience with this kind of thing.

[identity profile] electroniccrane.livejournal.com 2010-08-07 04:37 am (UTC)
AI, or man's inhumanity to man?

[identity profile] autodidacticone.livejournal.com 2010-08-07 04:54 am (UTC)
Do I have to pick just one?

[identity profile] techno-rockstar.livejournal.com 2010-08-05 11:10 pm (UTC)
There is a reason for that. In both fiction and real life, robots are often made to imitate human behavior so it's easier to relate to them.

[identity profile] techno-rockstar.livejournal.com 2010-08-05 11:53 pm (UTC)
It's only logical if they are expected to work in a human society.

[identity profile] autodidacticone.livejournal.com 2010-08-05 11:58 pm (UTC)
No, it's only logical if they're expected to be social in a human society. Nobody needs chatty robotic window-washers.

[They just want them really badly]

Besides, robots don't even need to be remotely sentient to be functional on that level. You can get them to fake it with no trouble at all.

[identity profile] techno-rockstar.livejournal.com 2010-08-06 12:25 am (UTC)
Humans prefer similarity. It may be an external factor, but sociability can affect a robot's performance.

I can't speak for robot sentience. I personally dislike it.

[identity profile] autodidacticone.livejournal.com 2010-08-06 01:02 am (UTC)
Humans. Are generally stupid. They'd almost always rather die than go through meaningful change.

[People like you piss Kimmi right the fuck off, E.]

[identity profile] techno-rockstar.livejournal.com 2010-08-06 01:27 am (UTC)
It depends on what you consider meaningful. If a change is worthwhile enough, it's likely that it will eventually be adopted.

[ Her and everyone. ]

[identity profile] autodidacticone.livejournal.com 2010-08-06 03:13 am (UTC)
That only works if you're willing to let them die. If you're willing to let them know you'd let them die for not throwing—

...

What do you know about it?

[identity profile] techno-rockstar.livejournal.com 2010-08-06 05:52 am (UTC)
I've seen progress halted by ignorant fools. Personally.

Humanity as a whole may be ignorant, but it will always be pushed forward.

[identity profile] autodidacticone.livejournal.com 2010-08-06 01:08 pm (UTC)
That's a pretty big assumption, mister optimist.

On what are you basing this? What global crisis has humanity faced in your world that they had to push past as a unit?

[identity profile] techno-rockstar.livejournal.com 2010-08-06 06:41 pm (UTC)
Vampires, alien invasions, criminals. Humanity as a group has not, but individuals in my world have shown both determination and the capacity for progress.

[ Not that he likes it very much, since it amounts to Batman and friends. ]

[identity profile] autodidacticone.livejournal.com 2010-08-07 02:29 am (UTC)
Individual evolution is towards the creation of new strata. It doesn't reflect the evolution of the whole. If we don't move forward together, then 'we' can't be said to be moving forward.

You're just acting as a medium for those that actually change, and pretending this exonerates you personally. There's no inherent crime in staying behind, but don't argue that it makes you special.

[identity profile] techno-rockstar.livejournal.com 2010-08-07 06:32 am (UTC)
Evolution itself works through individuals. 'We' is what happens after propagation.

[Audio]

[identity profile] beforethehero.livejournal.com 2010-08-06 06:32 am (UTC)
I suppose that would depend on what kind of complete, illogical jackass you were.

[Audio]

[identity profile] autodidacticone.livejournal.com 2010-08-06 01:09 pm (UTC)
Crazy, ignorant cop, apparently.

[identity profile] intimacyphobic.livejournal.com 2010-08-07 03:54 am (UTC)
I still... don't understand why someone would want to give human drives and capabilities to a... robot? Or any being for that matter.

I thought the idea of creating such engineered life was to improve upon the design already in place? To find a way around the fallacies of your own species.