I, Robot
What are the three laws of robotics in I Robot?
Asimov’s famous codified laws of robot behavior are very useful in exploring the idea of not having free will.
“One, a robot may not injure a human being or, through inaction, allow a human being to come to harm. ... Two ... a robot must obey the orders given it by human beings except where such orders would conflict with the First Law.... Three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.”
With such laws in place, a strong argument can certainly be made that robots are limited in their capacity to transgress, their behavior modulated by the imposition of an external will. Indeed, the argument can be made that this is precisely what separates robots from humans when it comes to exercising free will. If a robot cannot injure a human and must obey orders, it immediately occupies a realm outside that of human beings, whose behavior is not restricted by such orders. And yet, symbolically speaking, how are the Three Laws of Robotics substantively different from the Ten Commandments? Simply applying the term "law" to behavior guarantees nothing. Even writing into a program that robots cannot break these laws brings all three back into the symbolic realm. After all, is it not a human who wrote that program? Some human beings ascribe the Ten Commandments not to human hands but to God’s hand, yet that hardly keeps those same believers from violating them at will. The laws are symbols of desired behavior only; their enforcement cannot be assured.