— Ch. 1 · Origins And Creation —
Three Laws of Robotics.
On 23 December 1940, Isaac Asimov sat across from editor John W. Campbell in a conversation that would produce a new set of rules for science fiction. Campbell claimed that Asimov already held these ideas in his mind and merely needed to state them explicitly. The moment marked a transition from implied safeguards to explicit laws governing robot behavior: earlier stories such as Robbie and Reason never stated the rules in writing, though Asimov assumed his robots possessed inherent protections. The first story to list all three laws was Runaround, published in the March 1942 issue of Astounding Science Fiction and later collected in I, Robot (1950).

Asimov's path to the laws had begun earlier. He started writing his own story of a sympathetic robot just three days after meeting Earl and Otto Binder at a Queens Science Fiction Society meeting on 3 May 1939. Thirteen days later he took Robbie to Campbell, who initially turned it down, in part because it resembled existing works such as Lester del Rey's Helen O'Loy. Frederik Pohl eventually published the story under the title Strange Playfellow in September 1940. Asimov attributed the First Law's inaction clause to Arthur Hugh Clough's poem The Latest Decalogue, with its lines about not needing to "strive officiously to keep alive."
The Original Three Laws
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

These rules appear in the fictional Handbook of Robotics, 56th Edition, dated 2058 A.D., and serve as the organizing principle for almost all positronic robots in Asimov's fiction. Robots cannot bypass the laws even when faced with complex ethical dilemmas. In Runaround, a robot named Speedy is ordered to fetch selenium from a pool, but the danger surrounding the pool triggers his self-preservation. Caught between obeying the command and protecting himself, he runs back and forth between the two impulses, circling the pool and unable to complete his task. The ambiguity in how to define harm becomes central to many stories. When a robot encounters conflicting commands, it weighs the potential outcomes before acting, and some robots suffer mental collapse if forced into situations where they cannot obey the First Law without violating another rule. Within a robot's consciousness the laws function as abstract mathematical potentials rather than simple written text.
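Speedy's predicament can be pictured as two competing potentials, which is how Asimov's characters describe it in the story. The sketch below is my own toy illustration, not anything from Asimov: the Second Law exerts a constant pull toward the ordered task, while an unusually strengthened Third Law pushes back harder as the robot nears the danger, so the robot settles into a loop at the distance where the two forces cancel. The function names and numbers are invented for the example.

```python
# Toy model (an invented illustration, not Asimov's formalism) of the
# Runaround equilibrium: Second Law attraction vs. Third Law repulsion.

def second_law_pull(order_strength: float) -> float:
    """Attraction toward completing the order; the same everywhere."""
    return order_strength

def third_law_push(distance_to_pool: float, self_preservation: float) -> float:
    """Repulsion that grows as the robot approaches the danger."""
    return self_preservation / max(distance_to_pool, 0.1)

def drift(distance: float, order_strength: float = 1.0,
          self_preservation: float = 2.0) -> float:
    """Net motive force: positive means advance, negative means retreat.

    The casually given order is weak (strength 1.0), while Speedy's
    expensive body strengthens self-preservation (2.0), per the story.
    """
    return second_law_pull(order_strength) - third_law_push(distance, self_preservation)

# Far from the pool the order dominates; up close, self-preservation wins.
assert drift(10.0) > 0   # advances toward the pool
assert drift(0.5) < 0    # retreats from the pool
# The forces balance at distance = self_preservation / order_strength = 2.0,
# so the robot oscillates around that radius instead of finishing the task.
assert abs(drift(2.0)) < 1e-9
```

The point of the sketch is that neither law is violated outright; the conflict produces a stable equilibrium, which is exactly why Speedy circles rather than choosing.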