... there are two basic mind frames when addressing the subject: A) an AI is any program that can perform a task a human can do with greater efficiency or power ... B) if a human were to interact with the AI and was unable to tell from the interaction that the other party was not human, then it is an AI.
....
Simply processing digitally encoded input, like your example A, is not AI; that's computation.
....
As part of their safety intelligence concept, the authors have proposed a "legal machine language," in which ethics are embedded into robots through code, which is designed to resolve issues associated with open texture risk - something which Asimov's Three Laws cannot specifically address.
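To make the contrast with Asimov's natural-language laws concrete, here is a minimal sketch of the general idea of a rule encoded in machine-checkable form. This is purely a hypothetical illustration, not the authors' actual "legal machine language": the `Action` type, the `estimated_harm` score, and the threshold are all invented for the example. The point is that a rule only becomes enforceable by a machine once its open-textured terms (like "harm") are replaced by something the machine can evaluate.

```python
# Hypothetical illustration only - NOT the authors' actual system.
# A vague law like "a robot may not harm a human" is open-textured;
# to enforce it in code, "harm" must be reduced to a quantity the
# machine can actually evaluate (here, an invented 0.0-1.0 score).

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    estimated_harm: float  # assumed 0.0-1.0 scale, invented for this sketch

MAX_PERMITTED_HARM = 0.1  # threshold chosen arbitrarily for the example

def is_permitted(action: Action) -> bool:
    """A rule is machine-enforceable only once its terms are quantified."""
    return action.estimated_harm <= MAX_PERMITTED_HARM

print(is_permitted(Action("open pickle jar", 0.0)))  # True
print(is_permitted(Action("fire weapon", 0.9)))      # False
```

Of course, deciding how to compute `estimated_harm` in the first place is exactly where the open texture problem reappears, which is why this remains a hard research question rather than a solved one.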
if I ever get a robot, I'll give it a gun
One might argue that the human brain works like this...
Maybe I don't get it... Why would we give rights to a robot? It is a machine. It is programmed to do certain things - whether it be learning to play the piano or opening pickle jars - it is still a machine. I wouldn't give land-owning rights to my can opener just because it can sense when the can is open and stop cutting. Yes, AI is very cool, but even if you give a robot self-awareness, you can't give it real emotion, real pain or love or loss; it would always be programmed emotion - numbers and computations. They will always follow the logic they are programmed with (even if the original programming included learning, they are still following the original program instruction - to learn). Sorry, I really just don't understand what the fuss is about.
I don't see a reason why, if the robots become self-aware, their rights should not progress with them. Technology evolves, so must our legal system. When people became drivers, we created a new kind of law that regulates driving, so what's so bad about regulating robots' rights and/or their owners' rights?
Hmmm... Would an automobile on advanced autopilot be a robot? (Yes) How about an aircraft? (Yes, even those operating today).
Where you go wrong is treating the robots like they're sentient beings, and have rights. If you create a tool, you don't give it a bill of rights. You give it a warranty. Don't confuse life with animatronics.
If the engineers and programmers are competent, no such rebellion would ever be possible for a robot.
El_Nose
Jun 22, 2009
The software is the AI; the closed system unit, all of the software plus the optional hardware, is what makes a robot.
Now do we also address issues with AIs, or do we blame the programmer for an unpredictable black box?