Nemo Posted October 1, 2009 holy crap the nerdism is off the charts here. I seriously did not understand anything after post 10
Keyofx Posted October 1, 2009 HAL could've been programmed by Asimov's laws and still be a megalomaniac murdering psychopath (or was that from Lost in Space?) I don't see how. The laws would ensure that that doesn't happen, right? What the entity would ideally do is evaluate risks and outcomes and pick whichever causes the least harm. HAL didn't even come close to doing that. The bastard had his own agenda!
MarketTantrik Posted October 1, 2009 I don't see how. The laws would ensure that that doesn't happen right? What the entity would ideally do is evaluate risks and outcomes and pick whichever causes least harm. In "I, Robot" the AI decided that mankind needed to be subjugated for its own good, in accordance with its hardcoded laws for the preservation of human life.
Bulovski Posted October 1, 2009 I don't see how. The laws would ensure that that doesn't happen right? What the entity would ideally do is evaluate risks and outcomes and pick whichever causes least harm. HAL didn't even come close to doing that. The bastard had his own agenda! replace humanity with 'universe' and you have 'megalomania' this has gone way beyond OT - I rike it
Keyofx Posted October 1, 2009 @MT: I'm thinking law zero would make sure those instances don't happen. Asimov was writing under technological constraints back then. We've got enough to go on now to make sure those safety nets are built in, and work. I'm going to cause Judgment Day, aren't I?
MarketTantrik Posted October 1, 2009 @MT: I'm thinking law zero would make sure those instances don't happen. Asimov was writing under technological constraints back then. We've got enough to go on now to make sure those safety nets are built in, and work. Law Zero would cause an infinite-loop fatality. Not workable.
Keyofx Posted October 1, 2009 replace humanity with 'universe' and you have 'megalomania' Megalomania is an emotional/mental condition/delusion. It doesn't factor into AI, or doesn't have to if you're doing it right.
achilles Posted October 1, 2009 can b. if a planet can 'talk', law z is easy.
Tyler Posted October 1, 2009 Thread is WIN!
Keyofx Posted October 1, 2009 can b. if a planet can 'talk', law z is easy. Unicron?
Gautam Posted October 1, 2009 Unicron? Sometimes oranges are green. But they're still called oranges. This is one of those OT nonsense threads right?
Keyofx Posted October 1, 2009 No, Unicron really was a talking planet
Bulovski Posted October 1, 2009 Law Zero would cause an infinite-loop fatality. Not workable. explain Megalomania is an emotional/mental condition/delusion. It doesn't factor into AI, or doesn't have to if you're doing it right. a being smart enough to build 'right' into AI would not need to build the AI in the first place. Two great examples - The End of Eternity and Mass Effect (Quarians and Geth) time to change thread title
Gautam Posted October 1, 2009 No, Unicron really was a talking planet But some crabs eat their own poo...
abhi90 Posted October 1, 2009 No, Unicron really was a talking planet big transforming mofo evil planet......yeah.......
supersim Posted October 1, 2009 somebody shift this thread to gen chat ! free posts increment for all
Keyofx Posted October 1, 2009 But some crabs eat their own poo... They might, but I'm going to refer you to abhnit saar below: big transforming mofo evil planet......yeah....... You forgot scary. Good boy
Keyofx Posted October 1, 2009 explain a being smart enough to build 'right' into AI would not need to build the AI in the first place. Two great examples - The End of Eternity and Mass Effect (Quarians and Geth) I think he means that a decision-making loop would inevitably result in a system fatality because the factors that affect that decision are practically endless. Good point on rightness, but artificial governing constructs like HAL aren't/won't be built as replacements or equals. Asimov worked that in, too. Subordinates!
Nemo Posted October 1, 2009 Sometimes oranges are green. But they're still called oranges what, when apples are red?? they are still called apples ohh, and by the way
Archived
This topic is now archived and is closed to further replies.