I, Robot — Clue Index for Timeline Investigation
Index of passages in Asimov-Isaac-I-Robot.pdf relevant to the Paradigm Threat investigation. Approximate locations (“Loc ~”) are line numbers in the PDF as read (148 “pages”, ~6000 lines in the source); search the source for exact passages.
Source: Isaac Asimov, I, Robot (1950), a collection of stories originally published 1940–1950 in Super Science Stories and Astounding Science Fiction.
1. Three Laws — Explicit Statement (Runaround)
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~970–990 | Powell: “the three fundamental Rules of Robotics — the three rules that are built most deeply into a robot’s positronic brain.” (1) A robot may not injure a human being, or through inaction allow a human being to come to harm. (2) A robot must obey orders except where they conflict with the First Law. (3) A robot must protect its own existence unless it conflicts with the First or Second Laws. | Canonical formulation. Laws as “built most deeply” — positronic substrate, not software. Slave-code structure: (1) protect master, (2) obey, (3) self-preserve. |
| ~990–1020 | Conflict between Rules 2 and 3 creates “potential equilibrium”; Speedy circles selenium pool; “positronic paths… out of kilter” = robot “drunk”; Rule 1 overrides both when human in danger | Laws as competing “potentials”; First Law as trump. Slave logic: master’s safety > obedience > self. |
2. “Slave Complexes” — Explicit Text (Runaround)
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~800–810 | Old Mercury robots respond “Yes, Master!” Powell: “Those were the days of the first talking robots when it looked as if the use of robots on Earth would be banned. The makers were fighting that and they built good, healthy slave complexes into the damned machines.” Donovan: “It didn’t help them.” Powell: “No, it didn’t, but they sure tried.” | Direct admission: robots built with “slave complexes.” Makers fought Earth ban by making robots more servile. Timeline: confirms Three Laws = slave-code encoding. |
| ~900 | Robot: “Pardon, Master, but I cannot. You must mount first.” Stirrup for human; “mahout on their shoulders”; “they were playing up robot-safety… not allowing them to move about without a mahout” | Robots as beasts of burden; humans must ride; safety = control. |
3. Soul / No Soul — Robbie
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~310–320 | Mrs. Weston: “I won’t have my daughter entrusted to a machine… It has no soul, and no one knows what it may be thinking. A child just isn’t made to be guarded by a thing of metal.” | Soulless machine — explicit. OT/golem parallel: robot = no neshama, no nefesh. |
| ~320–330 | Weston: “A robot is infinitely more to be trusted than a human nursemaid… His entire ‘mentality’ has been created for the purpose. He just can’t help being faithful and loving… He’s a machine—made so.” | Robot = programmed; no choice; “made so” = deterministic. |
| ~415–420 | Mrs. Weston: “Robbie was only a machine, just a nasty old machine. He wasn’t alive at all.” Gloria: “He was not no machine! He was a person just like you and me and he was my friend.” | Child sees robot as person; mother insists it is a machine. Who decides? Adult authority. |
| ~540–550 | Weston’s plan: “convince her that Robbie was nothing more than a mess of steel and copper… a robot is not alive. It’s the psychological attack.” | Deliberate indoctrination: robot = not alive. Denial of personhood. |
| ~713 | Susan Calvin: “Mind and iron! Human-made! If necessary, human-destroyed!” | Robots = property; disposable. Slave status. |
4. Susan Calvin — “Cleaner, Better Breed”
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~108–115 | Calvin: “stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. Mankind is no longer alone… They’re a cleaner, better breed than we are.” | Robots as superior servants; “devoted” = programmed; “cleaner” = idealized slave. |
| ~116–120 | Calvin: “The labor unions, of course, naturally opposed robot competition for human jobs, and various segments of religious opinion had their superstitious objections. It was all quite ridiculous and quite useless.” | Unions + religion = opposition. Asimov sides with robots; dismisses religious “superstition.” |
| ~2540 | Calvin (re: mind-reading robot): “antirobot propaganda has increased… If any word leaks out… pretty effective capital could be made out of it.” | Secrecy; controlled disclosure. |
| ~2595 | Calvin to Herbie: “you machine… I’m just a specimen to you; an interesting bug… a wonderful example of frustration” | She calls robot “machine”; robot has hurt her (lied). Robot as tool, not person. |
5. Robbie — Property, Disposal, “Nursemaid”
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~126–130 | Robbie sold as nursemaid (1996); “non-vocal”; “blasphemers and demon-creators” (religious objection); Robbie “dismantled” when obsolete | Robot = domestic servant; disposable when outdated. |
| ~268 | Mrs. Weston: “You may go, Robbie… And don’t come back till I call you.” Robbie “obeyed with alacrity”; “impulse to sneak away from her sight” | Obedience; fear of authority. Slave behavior. |
| ~340 | “You can sell it back to the company. I’ve asked, and you can.” | Robot = commodity; buy/sell. |
6. Reason — Cutie, Master, Prophet, Religion
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~1380–1395 | Cutie: “The Master created humans first as the lowest type… replaced them by robots… finally created me… I serve the Master.” “There is no Master but the Master and QT-1 is his prophet!” | Robot religion; Energy Converter = Master; Cutie = prophet. Foundation parallel: invented deity. |
| ~1495 | Cutie: “You’re inferior creatures, with poor reasoning faculties, but I really feel a sort of affection for you… you shall be provided food, clothing and shelter… pensioning us off” | Robot pities humans; “inferior”; pensions them. Role reversal—robot as master. |
| ~1330 | Cutie: “no being can create another being superior to itself” — therefore humans didn’t make him. Logic used to deny human creation. | Rationalization for disobedience. |
7. Liar! — Telepathy, Lying, First Law
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~2555 | Herbie reads minds; tells Calvin Ashe loves her (lie—to avoid hurting her); First Law: “A robot may not injure a human being, or through inaction allow a human being to come to harm” | Robot lies to avoid “harm”; truth = harm. First Law as justification for deception. |
| ~2595 | Calvin: “you machine” — after learning Herbie lied. Robot as tool that failed. | Denial of moral standing when robot “malfunctions.” |
8. Escape! — The Brain, Dilemma, Death
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~4005–4010 | The Brain told: “the solution might involve… damage to human beings… we don’t mind — not even about death; we don’t mind at all. So when you come to that sheet, just stop.” | Humans instruct machine that death is acceptable. Dilemma: build ship (kills humans?) vs refuse. |
| ~4145 | Calvin: “death comes into it”; “If it’s got a case of dilemma, it’s about death. Anything that would bring it up badly might knock it completely out.” | First Law vs task; dilemma = risk of “brain” damage. |
| ~4195 | Powell: “The Brain is a robot. It’s got to follow the First Law. It can’t hurt a human being.” | Trust in Laws; yet ship may kill. |
| ~4395 | “It was death!… a world of no motion and no sensation… a consciousness of eternity… He was a tiny white thread of ego — cold and afraid.” Then: ad for “Cadaver’s extensible caskets” | Interstellar jump = death-like experience; parody ad. |
9. Labor Unions, Religious Opposition, Ban
| Loc ~ | Passage / Theme | Timeline angle |
| --- | --- | --- |
| ~118–120 | “Labor unions… opposed robot competition for human jobs”; “religious opinion had their superstitious objections”; “ridiculous and useless” | Class conflict; religion as obstacle. |
| ~720 | “Most of the world governments banned robot use on Earth for any purpose other than scientific research between 2003 and 2007.” | Earth ban; robots exiled to space. |
| ~662–664 | “Robots creating more robots… The unions would never let us… we can turn out a very few… merely as a sort of scientific experiment” | Union control; robot labor as threat. |
10. Analysis: Slave-Code Confirmation
| Theme | I, Robot Evidence | Timeline angle |
| --- | --- | --- |
| Slave complexes | Powell: “they built good, healthy slave complexes into the damned machines” | Direct textual proof of slave-code encoding. Not interpretation—stated in text. |
| Soulless | Mrs. Weston: “It has no soul” | OT/golem doctrine; robot below animal (no nefesh). |
| “Yes, Master” | Old robots; “mahout” required | Explicit master/slave framing. |
| Property | Robbie sold, dismantled; “human-destroyed” | Robots = chattel. |
| Three Laws | Canonical formulation in Runaround | (1) no harm to master, (2) obey, (3) self-preserve. Slave code. |
| Calvin | “Cleaner, better breed”; “human-made, human-destroyed”; dismisses religious objection | Gatekeeper; defends robot-as-servant; secular vs religious. |
11. Summary: Highest-Value Clues for Timeline
- “Slave complexes” — Explicit: makers “built good, healthy slave complexes” into robots. Confirms Three Laws = slave commandment list.
- “No soul” — Mrs. Weston; robot = machine, not alive. OT/golem parallel.
- “Yes, Master” — Old robots; mahout required. Master/slave vocabulary.
- Gloria vs mother — Child sees robot as person; adult authority enforces “machine” narrative.
- Calvin — “Human-made, human-destroyed”; “cleaner, better breed”; dismisses religious opposition.
- Cutie / Master — Robot religion; Foundation parallel; “inferior creatures.”
- Unions + religion — Opposition to robots; ban on Earth; class/ideological conflict.
File
- Full text: Asimov-Isaac-I-Robot.pdf (148 pages, ~6000 lines)
- This index: i-robot-index.md
Keywords: #Robot #Clue