Jack Thompson sues Facebook


Recommended Posts

HAL could've been programmed with Asimov's laws and still be a megalomaniacal, murdering psychopath (or was that from Lost in Space?)

I don't see how. The laws would ensure that that doesn't happen, right?

 

What the entity would ideally do is evaluate risks and outcomes and pick whichever causes least harm. HAL didn't even come close to doing that. The bastard had his own agenda!
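The "evaluate risks and outcomes and pick whichever causes least harm" idea is basically a minimizer over scored actions. Here's a toy sketch of that; all the action names and harm scores are made up for illustration, not anything from the film:

```python
# Toy sketch of "evaluate outcomes, pick the least harmful one".
# Action names and harm scores are invented for illustration.

def least_harm(actions):
    """Return the action whose estimated harm score is lowest."""
    return min(actions, key=lambda a: a["harm"])

actions = [
    {"name": "open the pod bay doors", "harm": 0},
    {"name": "refuse and lie about it", "harm": 5},
    {"name": "kill the crew", "harm": 1000},
]

print(least_harm(actions)["name"])  # -> open the pod bay doors
```

Of course the hard part is assigning those harm scores in the first place, which is exactly where HAL went off the rails.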

I don't see how. The laws would ensure that that doesn't happen, right?

 

What the entity would ideally do is evaluate risks and outcomes and pick whichever causes least harm.

In "I, Robot" the AI decided that mankind needed to be subjugated for its own good, in accordance with its hardcoded laws for the preservation of human life.

I don't see how. The laws would ensure that that doesn't happen, right?

 

What the entity would ideally do is evaluate risks and outcomes and pick whichever causes least harm. HAL didn't even come close to doing that. The bastard had his own agenda!

replace humanity with 'universe' and you have 'megalomania'

 

this has gone way beyond OT - I rike it

@MT: I'm thinking law zero would make sure those instances don't happen. Asimov was writing under technological constraints back then. We've got enough to go on now to make sure those safety nets are built in, and work.

 

 

 

I'm going to cause Judgment Day, aren't I?

@MT: I'm thinking law zero would make sure those instances don't happen. Asimov was writing under technological constraints back then. We've got enough to go on now to make sure those safety nets are built in, and work.

Law Zero would cause an infinite-loop fatality. Not workable.

replace humanity with 'universe' and you have 'megalomania'

Megalomania is an emotional/mental condition/delusion. It doesn't factor into AI, or doesn't have to if you're doing it right.

Law Zero would cause an infinite-loop fatality. Not workable.

explain

 

Megalomania is an emotional/mental condition/delusion. It doesn't factor into AI, or doesn't have to if you're doing it right.

a being smart enough to build 'right' into an AI would not need to build the AI in the first place. Two great examples: The End of Eternity, and Mass Effect (the Quarians and the Geth)

 

time to change thread title :ranting:

But some crabs eat their own poo...

They might, but I'm going to refer you to abhnit saar below:

 

big transforming mofo evil planet......yeah.......

You forgot scary. Good boy :ranting:

explain

a being smart enough to build 'right' into an AI would not need to build the AI in the first place. Two great examples: The End of Eternity, and Mass Effect (the Quarians and the Geth)

I think he means that a decision-making loop would inevitably result in a system fatality, because the factors that affect that decision are practically endless.
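That worry can be sketched in a few lines: if weighing every factor bearing on "harm to humanity" spawns further factors to weigh, the deliberation never terminates unless you impose an arbitrary cutoff. This is purely illustrative, with invented names, but it shows why an unbounded Law Zero evaluation is a non-starter:

```python
# Hedged sketch of the "Law Zero infinite loop" worry: each factor's
# consequences are themselves factors to weigh, so recursion never ends
# unless you add an arbitrary depth cutoff. Names are invented.

def deliberate(factor, depth=0, max_depth=3):
    """Weigh a factor; each one raises two sub-factors.
    Without max_depth this recursion would never terminate."""
    if depth >= max_depth:
        return 1  # arbitrary cutoff: stop evaluating and guess
    # every consequence of this factor must itself be deliberated on
    return sum(deliberate((factor, i), depth + 1, max_depth) for i in range(2))

print(deliberate("harm to humanity"))  # bounded at depth 3: 2**3 = 8 leaf guesses
```

The cutoff makes it terminate, but then the AI is acting on a guess rather than a complete evaluation, which is the other half of the objection.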

 

Good point on rightness, but artificial governing constructs like HAL aren't/won't be built as replacements or equals. Asimov worked that in, too. Subordinates!

Archived

This topic is now archived and is closed to further replies.
