ISSUE 20.15.F • 2023-04-10 • Text Alerts! • Gift Certificates
You’re reading the FREE newsletter

Susan Bradley

You’ll immediately gain access to the longer, better version of the newsletter if you make a donation and become a Plus Member. You’ll receive all the articles shown in the table of contents below, plus access to all our premium content for the next 12 months. And you’ll have access to our complete newsletter archive!

Upgrade to Plus membership today and enjoy all the Plus benefits!

In this issue

COMMENTARY: The Three Laws of Robotics

FROM THE FORUMS: Ten top forum topics — Support

Additional articles in the PLUS issue • Get Plus!

LEGAL BRIEF: Your call is very important — to you

FREEWARE SPOTLIGHT: Opal — Now I need a nap

PATCH WATCH: How do you install and patch your new computer?


ADVERTISEMENT
1Password

Pricing for teams & businesses | 1Password

Review our team pricing and sign up for a Free Trial to get access to password manager, digital vault, password generator, digital wallet, and more.


COMMENTARY

The Three Laws of Robotics

Will Fastie

By Will Fastie

Along with its recent announcement of Copilot, Microsoft made a point of mentioning “responsible AI.”

Undoubtedly, part of the reason for bringing the matter up was the almost instant controversy surrounding Bing AI, Microsoft’s integration of its AI engine into Bing and Edge, especially its apparently threatening behavior toward a reporter.

What does “responsible AI” mean?

We can’t tell for sure what Microsoft means by the phrase. Speculation is often a pointless exercise, but Bing AI’s behavior raises very serious questions. And now we face the same challenge in Microsoft 365’s apps, because Copilot is built into them all.

Popular culture trumps academia

The matter of intelligent behavior by machines is not a new discussion. However, before 1921 it was largely an academic subject, not something the everyday person was particularly concerned with or interested in. Part of the reason was that, well into the industrial revolution, the thought that a machine might be anything more than a capable tool would have seemed ridiculous.

In 1921, Czech writer Karel Čapek debuted the play R.U.R., the initials standing for Rossumovi Univerzální Roboti (Rossum’s Universal Robots, in English). The play proved very popular, was translated into 30 languages within three years, and — significantly — introduced the word “robot.” This work had a singular impact on subsequent science fiction.

Science fiction is precisely where the masses get their information about robots and machine intelligence. In other words, they are informed by popular culture, not by academic study.

That may not be a bad thing. Unlike scientists of any ilk, creative writers are not beholden to reality. For example, a common theme in fictional space epics is starships capable of faster-than-light speeds, which contemporary physics says can’t happen. When a writer has a scientific background to go along with creative skills, interesting things can happen. Arthur C. Clarke wrote about many technologies that did not exist at the time but do today, satellites being just one.

I, Robot book cover

Sci-fi writers were also able to conjure up robots, and they have. The most influential writer to do so was Isaac Asimov, educated at Columbia University and a professor of biochemistry at Boston University. In 1942, he published the short story “Runaround” in Astounding Science Fiction magazine, where his famous Three Laws of Robotics were introduced. In 1950, “Runaround” was included in his collection I, Robot (left), which is usually cited as the source of the Three Laws. (Image courtesy Del Rey Books)

In 1951, 20th Century Fox brought us the film The Day the Earth Stood Still. It’s interesting to watch, although many consider it a weak anti–nuclear bomb screed. The robot, Gort, is a member of an interstellar police force that will destroy any planet that threatens the peace of the universe. Gort’s actions in the movie are mildly consistent with the Three Laws, but see the Zeroth Law below.

In 1956, MGM released the film Forbidden Planet. One of its “stars” is Robby the Robot. In one memorable scene, Robby is ordered to do something contrary to his programming and is saved from self-destruction only by his owner’s cancellation of the order. To this writer’s knowledge, it is the first movie presentation of the Three Laws in action.

In 1966, Robert A. Heinlein’s novel The Moon Is a Harsh Mistress was published. The story is about a revolution in the Earth’s prison colony on the moon, but a critical sub-story is the presence of a huge, self-aware computer nicknamed “Mike,” who runs everything. Mike has a sense of humor, yet another sub-story. For many reasons, including Heinlein’s superb artistry regarding social norms, Moon is an enormously entertaining book that won the Hugo Award in 1967. For hard (technical) sci-fi fans (as opposed to those who prefer fantasy), Mike is the central figure. However, Mike acts as he wants to; the Laws don’t seem to be present.

In 1968, MGM released Stanley Kubrick’s brilliant film 2001: A Space Odyssey. A central character in the movie is the HAL 9000 computer (called simply “HAL”), who went rogue. By the ’60s, more and more people were becoming aware of computers and their capabilities; it was not a stretch to think that HAL and all the other technology shown in the film would actually be in place just 33 years hence. After all, humans were a mere year away from setting foot on the moon.

In 1977, George Lucas’s Star Wars hit the big screen, with “droids” littered everywhere. It’s never quite clear whether the droids are sentient, but it’s abundantly clear they can be highly intelligent. In this fan’s opinion, no Laws. Except for the cuddly ones, droids are portrayed as appliances.

We weren’t done with HAL. In 1984, Peter Hyams brought Arthur C. Clarke’s second novel in the series, 2010: Odyssey Two, to the screen as 2010: The Year We Make Contact, again with HAL playing a central role. This time, however, HAL’s inventor, Dr. R. Chandra (brilliantly portrayed by Bob Balaban), proves that HAL did not go rogue but was conflicted by having been lied to prior to the 2001 mission. Spoiler alert: HAL deliberately killed a human being in 2001 because he was told to put the mission before all other concerns.

These are just a few of the instances of intelligent machines in popular culture; there are hundreds more. But a big change has come about in academia, as it became clearer that artificial intelligence would eventually be a thing. To grasp the full extent of this, take a glance at the omnibus article Ethics of artificial intelligence (Wikipedia). Be warned — it’s a rabbit hole.

The Three Laws

Asimov described a way to provide an ethical and moral basis for the behavior of robots, primarily to protect humans. Here are the Three Laws from “Runaround,” as stated in I, Robot:

  1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Much later, in Asimov’s 1985 novel Robots and Empire, the so-called Zeroth Law was added (although it had been discussed in the 1950 short story “The Evitable Conflict”). But it was not added by humans; robots, which by then had been given responsibility for governance, “evolved” to add it themselves.

  0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

The other three laws were adapted for proper precedence.

Asimov tinkered with the exact wording of the Laws throughout his fiction, but the Zeroth Law is a game-changer. The first incarnation of the laws was specifically designed to keep humans safe from robots, but the Zeroth Law allows a robot to harm a human being if the good of humanity as a whole is served. That good is determined by — who else? — robots.

Asimov postulated that the Laws would be hard-wired into what he called the “positronic brain.” Think of that as an unalterable BIOS, a hard-wired neural network the robot itself cannot change. In effect, an order given to the robot would pass through that network, which determined whether the robot would obey. In case of conflicts between the Laws (which Asimov wrote about at length in other stories), the positronic brain would fry.

In effect, the Laws were the fundamental moral code for the robots’ behaviors.
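
For readers who think in code, here is one way to picture that precedence, with the Zeroth Law sitting above the other three. This is strictly a toy sketch: Asimov never specified any mechanism like this, and every name and flag below is invented for illustration.

    # A toy illustration of the Laws as an ordered filter on incoming orders.
    # Nothing here comes from Asimov, who imagined the rules hard-wired into
    # the positronic brain rather than written in software.

    from dataclasses import dataclass

    @dataclass
    class Order:
        harms_humanity: bool = False   # Zeroth Law concern
        harms_a_human: bool = False    # First Law concern
        serves_humanity: bool = False  # the Zeroth Law's override
        endangers_robot: bool = False  # Third Law concern

    def obey(order: Order) -> bool:
        """Evaluate an order against the Laws in precedence order:
        Zeroth, then First, then Second, then Third."""
        if order.harms_humanity:
            return False               # Zeroth Law: an absolute veto
        if order.harms_a_human and not order.serves_humanity:
            return False               # First Law, unless the Zeroth overrides it
        # Second Law: any order that survives the checks above is obeyed.
        # Third Law: self-preservation ranks last, so danger to the robot
        # is never, by itself, grounds for refusal.
        return True

    # Before the Zeroth Law, an order to harm one human was always refused;
    # with it, the same order goes through if "humanity as a whole" is served.
    print(obey(Order(harms_a_human=True)))                        # False
    print(obey(Order(harms_a_human=True, serves_humanity=True)))  # True

The unsettling part, as noted above, is that the “serves humanity” judgment is made by the robot itself.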

Ethics?

The Three Laws of Robotics have been debated ever since they first appeared. One writer suggested that the laws were entirely bogus because the technology does not exist to put them into place. But that would be like saying Arthur C. Clarke was delusional about satellites because they didn’t exist and couldn’t be created at the time he wrote about them, or that Jules Verne was completely off base about submarines.

Still, it appears that we do not know how to put appropriate restraints on the AI systems that are emerging. Even if a computer could be fitted with a device that would analyze the Laws and assure destruction of a machine gone bad, it’s all in software — a machine without the laws could still run it, or the device could be hacked.

The backlash over Bing AI’s threats is a very loud, public reaction to precisely this problem. These systems have no ethics, no moral code. We can’t tell if they have our (human) interests at heart. It’s not as if Microsoft came back with “Hey, we’re sorry about that, and we’ve adjusted the knobs and levers behind the curtain to increase the AI’s moral sensitivity.” Instead, Microsoft put limits on how we interact with Bing AI by reducing the extent of the interactions and identifying off-limits topics. In other words, no ethical fixes — just tweaks to help Microsoft keep us from noticing.

Microsoft is highlighted here because of its bungled release of Bing AI. But make no mistake — all the other big tech players, as well as dozens of other firms, will be pushing AI-fueled systems to us soon. Worse, government will surely adopt similar systems for its interactions with citizens. Perhaps that could be a good thing, with the AI providing even-handed implementation of government regulations. But without that underlying ethical code, it’s easy to think that such systems could become just as corrupt as some humans.

We really need to be careful about that Zeroth Law, just as we must be cautious about the AI-based systems we will encounter over the next few years.

Join the conversation! Your questions, comments, and feedback about this topic are always welcome in our forums!

Will Fastie is editor in chief of the AskWoody Plus Newsletter.


FROM THE FORUMS

Ten top forum topics — Support

The section of the forums devoted to support has long been the most active area of all. Did you know that the Support section alone accounts for over 158,000 topics with over 885,000 comments?

We appreciate both the questions and the ongoing willingness of forum members to offer their knowledge, experience, and assistance to help others solve their problems. So check out the Support section from time to time — you never know when you might have that unique insight that will help someone else.

In the first quarter of this year, over 400 new questions were asked and over 4,000 replies were posted. Here are the ten most active Q&A sessions from that period.

By the way, don’t forget to log in before posting!


ADVERTISEMENT
.COM for just $6.98 at Namecheap


Here are the other stories in this week’s Plus Newsletter

LEGAL BRIEF

Max Stul Oppenheimer

Your call is very important — to you

By Max Stul Oppenheimer, Esq.

You may have had the experience. You sign up for a service simply by clicking on a link, then wait on hold endlessly to solve a problem or cancel the service.

It may be small comfort, but you are not alone. The Federal Trade Commission (FTC) has recognized the pervasiveness of the phenomenon and has proposed a new rule to deal with it.

FREEWARE SPOTLIGHT

Deanna McElveen

Opal — Now I need a nap

By Deanna McElveen

Last year, a Gallup survey of over 3,000 adults in the U.S. reported that only 32% of us get “excellent” sleep, 35% get “good” sleep, and 33% get “poor” or “fair” sleep.

The one thing that has helped calm my mind so I can fall asleep is sound. There is simply no way I can fall asleep in a quiet room. Give me that pan flute CD from the swap meet, or an episode of “Frasier,” and I’m out in 10 minutes!

PATCH WATCH

Susan Bradley

How do you install and patch your new computer?

By Susan Bradley

I’m doing something vastly different this week.

Right off the bat, you’ll notice that this article is a bit shorter than I usually write. That’s because it describes the actual writing task to which I’ve set myself. I’ve prepared two checklist documents about setting up a Windows PC, one for Windows 10 and one for Windows 11.


Know anyone who would benefit from this information? Please share!
Forward the email and encourage them to sign up via the online form — our public newsletter is free!


Enjoying the newsletter?

Become a PLUS member and get it all!

Don’t miss any of our great content about Windows, Microsoft, Office, 365, PCs, hardware, software, privacy, security, safety, useful and safe freeware, important news, analysis, and Susan Bradley’s popular and sought-after patch advice.

PLUS, these exclusive benefits:

  • Every article, delivered to your inbox
  • Four bonus issues per year, with original content
  • MS-DEFCON Alerts, delivered to your inbox
  • MS-DEFCON Alerts available via TEXT message
  • Special Plus Alerts, delivered to your inbox
  • Access to the complete archive of nearly two decades of newsletters
  • Identification as a Plus member in our popular forums
  • No ads

We’re supported by donations — choose any amount of $6 or more for a one-year membership.

Join Today • Gift Certificate

The AskWoody Newsletters are published by AskWoody Tech LLC, Fresno, CA USA.

Microsoft and Windows are registered trademarks of Microsoft Corporation. AskWoody, AskWoody.com, Windows Secrets Newsletter, WindowsSecrets.com, WinFind, Windows Gizmos, Security Baseline, Perimeter Scan, Wacky Web Week, the Windows Secrets Logo Design (W, S or road, and Star), and the slogan Everything Microsoft Forgot to Mention all are trademarks and service marks of AskWoody Tech LLC. All other marks are the trademarks or service marks of their respective owners.

Copyright ©2023 AskWoody Tech LLC. All rights reserved.