
ISSUE 21.19.F • 2024-05-06
You’re reading the FREE newsletter

Susan Bradley

You’ll immediately gain access to the longer, better version of the newsletter when you make a donation and become a Plus Member. You’ll receive all the articles shown in the table of contents below, plus access to all our premium content for the next 12 months. And you’ll have access to our complete newsletter archive!

Upgrade to Plus membership today and enjoy all the Plus benefits!

In this issue

COMMENTARY: Ethics and computing

Additional articles in the PLUS issue

WINDOWS 11: Settings

FREEWARE SPOTLIGHT: DesktopOK — No more screaming when your icons get scrambled

PATCH WATCH: The Windows 11 disconnect


ADVERTISEMENT
The Nokbox

Estate Planning & Organization

If something were to happen to you tomorrow, would your next of kin be prepared to manage all of your assets, finances, and wishes?

They will if you have a Nokbox: a Next of Kin box.


www.thenokbox.com


COMMENTARY

Ethics and computing

Michael A. Covington

By Michael A. Covington

Computer ethics and AI ethics are easier than you think, for one big reason.

That reason is simple: if it’s wrong to do something without a computer, it’s still wrong to do it with a computer.

See how much puzzlement that principle clears away.

Consider, for example, the teenagers in several places who have reportedly used generative AI to create realistic nude pictures of their classmates. How should they be treated? Exactly as if they had been good artists and had drawn the images by hand. The only difference is that computers made it easier. Computers don’t change what’s right or wrong.

In fact, one warning sign of people cutting ethical corners is that they start saying, “The computer did X” rather than “We used the computer to do X.” That’s a bit like saying the hammer hit your thumb. It didn’t do it by itself. People do things to people, using computers.

What is computer ethics?

If that’s all there is to it, why is computer ethics or AI ethics even a subject? Haven’t we reduced all of it to the ethics we already know?

No, and again there’s a big reason. Computers put people in unfamiliar situations, where they don’t know how to apply ethical principles. Those unfamiliar situations are what we need to explore.

Is it a separate world?

Some people actually seem to think that when they’re at the computer, they’ve left Planet Earth and are in a separate, unreal world. Some are genuinely unsure whether social media connect them to real people or to characters in a game (or people pretending to be characters in a game). Multiplayer games are good entertainment but are not a good model of everything else that happens on the computer.

Others turn off their ethical common sense when faced with a technical challenge. Tricking a machine — in the sense of getting it to work in an unforeseen way — is, of course, not wrong. But tricking people is wrong, and some computer users pay too little attention to the humans on the other end of their software.

Some people actually want to deny that right and wrong even apply to the computer world. “There are no laws in cyberspace,” wannabe hackers used to say to me in the 1990s, along with “I don’t believe in cyber ethics.” Even law-enforcement agents were confused. Some told me there didn’t seem to be a law against breaking into computers remotely. To which I replied, “Nonsense! The laws of our state don’t mention stealing elephants, but that doesn’t mean it’s okay to steal an elephant.” In the case of computer hacking, it’s wrong to tamper with someone else’s machine, even if it’s done over a computer network. That’s a matter of both law and ethics.

Authority and accuracy

People often attribute too much accuracy and authority to computers. In the early days of business computing, the first thing everybody learned was “The computer never makes a mistake.” That led to a common 1960s ethical fumble: “The computer says you owe us money, so pay up!”

Nowadays, we know software and input data can be erroneous — but information handled by computer still has an air of authority. It is also, quite often, hard to verify; can anyone find out who originally put the information in and why it was thought to be accurate? Audit trails, a basic part of business data processing, were forgotten early in the microcomputer era and have not necessarily come back everywhere they’re needed.

Also, the ease of transferring information by computer, and the generally technical rather than humanistic mindset of IT specialists, can lead people to underestimate the ethical importance of confidentiality and accuracy. People can be harmed if their private information gets out, or if due care is not taken to keep it accurate. Thinking from the computer’s point of view can lead technical people to overlook the human consequences and risks of inaccuracy. A software bug can be more than a technical failure; it can ruin lives, as recently happened when faulty accounting software at the British Post Office caused hundreds of subpostmasters to be prosecuted for theft.

Then there is data security. Is your thumb drive the equivalent of a box of papers that ought to be in a vault? Then why is it in your pocket? To a technician, it may just be files, but to other people, it’s private information.

A related issue is outsourcing responsibility to the customer. The point of Web 2.0 is to make customers do their own clerical work. This cuts costs and can increase accuracy, but do they understand their responsibilities? “Don’t write down your password” is just the start of it. When things go wrong — as they inevitably will — are there good ways to recover? Or are the technicians poised to blame the victims?

Finally, in social media and email, we have all encountered fake messages that look very authentic. Partly I blame the architecture of the Internet, which was designed for research labs with the assumption that everyone is trustworthy; basic concepts such as verifying the origin of every data packet are still missing.

But partly it’s the nature of digital messages themselves. They consist of nothing but bits, and they can be copied flawlessly. It is easy to copy anything and change one detail, with no degradation of the rest. That wasn’t the case with paper documents, and people still haven’t adapted. That’s why we have phishing.

Judging a message by its appearance is an old habit that dies hard. Half a century ago, a letter from Bank of America would be on Bank of America stationery, which an impostor could not easily obtain. High-quality printing was expensive enough that any well-produced pamphlet was generally from an accountable source. It might be a strange or quirky source, but it wouldn’t be an impersonator. Malicious instructions from a faker, telling you in a very professional way to do something wrong, were rare.

Not anymore. I now have to warn people that the type on the screen looks just as neat, whether it’s telling the truth or lying, and that trademarks and document formats are easy to duplicate with no blurring or degradation. Things aren’t true just because they appear on the computer.

And that last point is a broader one. Computers are seen as sophisticated, modern, and important. People want to feel sophisticated, modern, and important. That makes people want to fit in by trusting information that comes from, or through, a computer. That bias can lead to bad decisions.

Mass communication

One big way computers put people into new ethical situations is by opening up mass communication to ordinary people. A social-media user can easily reach an audience as big as a small-town newspaper’s, but such a person often has little awareness of the responsibilities that go with it — or even of the size of the audience.

This creates two challenges. First, many acts that were already unethical used to be contained by not being able to reach many people. Petty slander and gossip are examples. If the audience is only a few close friends, and they are duly skeptical, the harm is mitigated. But if it becomes a social-media posting that is circulated to tens of thousands, that’s an entirely different matter. Misinformation, whether malicious or simply negligent, can now harm far more people, and everyone has more responsibility.

In fact, some people on social media fall under what I call the small-circle-of-friends illusion — that what they post will be seen only by a few like-minded people, perhaps just the ones they are thinking about at the moment. That’s not how it works, yet some people fall into such a mindset repeatedly. It reminds me of old stories of people picking up an early telephone receiver and yelling into the mouthpiece, with no regard for who was actually listening, “Tell Martha I’ll be late for dinner!” as if the machine itself knew how to deliver the message.

Second, some acts become unethical only when scaled immoderately. Hardly anyone would mind receiving a cold sales call once a month from a recognized local business. But with modern technology, each of us could get a robocall every minute, night or day, if it weren’t illegal. (And still, plenty of businesses break the law.) Moderation is an underappreciated part of ethics, partly because it’s hard to codify.

In the case of spam and robocalls, again I blame the design of the Internet and the telephone system. Costs aren’t imposed on the people who choose to consume resources. If sending an email cost the sender even one cent, there would be a lot less spam. The same goes for making robocalls. The technology to make them existed in 1965 (and was used in automated burglar alarms), but people didn’t get robocalls in their homes because long-distance calls cost money. Mechanisms to impose costs on the right people would also make them identifiable, which would be a good thing.

Artificial intelligence changes the ethical scene further. I’ll discuss why next time.

Contribute your thoughts in this article’s forum!

Michael A. Covington, Ph.D., is a retired faculty member of the Institute for Artificial Intelligence, The University of Georgia, and now an independent consultant. He has authored books and articles about computers, electronics, linguistics, and amateur astronomy.


ADVERTISEMENT


Here are the other stories in this week’s Plus Newsletter

WINDOWS 11

Ed Tittel

Settings

By Ed Tittel

The Settings app in Windows 11 remains endlessly under development, with the transition from Control Panel and the Microsoft Management Console far from complete.

With the introduction of Windows 8, Microsoft began a slow and deliberate changeover in how setup, configuration, and related settings are handled. In this first of a series of stories about the Settings app and Control Panel, we’ll take a long, hard look at Settings and describe where Control Panel still appears under the Settings umbrella.

FREEWARE SPOTLIGHT

Deanna McElveen

DesktopOK — No more screaming when your icons get scrambled

By Deanna McElveen

I have two large monitors and a lot of icons filling those screens. It may look like utter chaos, but my brain knows exactly where each icon is located.

So when some poorly crafted software leaves my icons all neatly stacked to the left side of my main monitor, I want to neatly stack some skulls!

DesktopOK by Nenad Hrg (one of my favorite developers) can put your mind at ease by allowing you to save the layout of your desktop icons. If things go south and your icons get messed with, you can quickly restore them. Want different layouts for different screen resolutions or users? No problem! DesktopOK can save as many backups as you want.

PATCH WATCH

Susan Bradley

The Windows 11 disconnect

By Susan Bradley

Despite my being a CPA, earnings calls are not usually a part of my technology coverage for Patch Watch.

I’m making an exception. I read the transcript of the Microsoft Fiscal Year 2024 Third Quarter Earnings Conference Call and found myself concerned with CEO Satya Nadella’s remarks. In the call, Nadella addressed recent problems Microsoft had encountered with security — from the company itself being hacked due to its own lack of attention to OAuth, to attackers breaking through using various other means.


Know anyone who would benefit from this information? Please share!
Forward the email and encourage them to sign up via the online form — our public newsletter is free!


Enjoying the newsletter?

Become a PLUS member and get it all!


Don’t miss any of our great content about Windows, Microsoft, Office, 365, PCs, hardware, software, privacy, security, safety, useful and safe freeware, important news, analysis, and Susan Bradley’s popular and sought-after patch advice.

PLUS, these exclusive benefits:

  • Every article, delivered to your inbox
  • Four bonus issues per year, with original content
  • MS-DEFCON Alerts, delivered to your inbox
  • MS-DEFCON Alerts available via TEXT message
  • Special Plus Alerts, delivered to your inbox
  • Access to the complete archive of nearly two decades of newsletters
  • Identification as a Plus member in our popular forums
  • No ads

We’re supported by donations — choose any amount of $6 or more for a one-year membership.


The AskWoody Newsletters are published by AskWoody Tech LLC, Fresno, CA USA.

Your subscription:

Microsoft and Windows are registered trademarks of Microsoft Corporation. AskWoody, AskWoody.com, Windows Secrets Newsletter, WindowsSecrets.com, WinFind, Windows Gizmos, Security Baseline, Perimeter Scan, Wacky Web Week, the Windows Secrets Logo Design (W, S or road, and Star), and the slogan Everything Microsoft Forgot to Mention all are trademarks and service marks of AskWoody Tech LLC. All other marks are the trademarks or service marks of their respective owners.

Copyright ©2024 AskWoody Tech LLC. All rights reserved.