Newsletter Archives
-
The time has come for AI-generated art
ISSUE 22.15 • 2025-04-14 • MEDIA
Look for our BONUS issue on April 21, 2025!
By Catherine Barrett
The horse may have five legs, but it’s already out of the barn.
AI-generated images are here to stay, and we need to learn how to recognize them and use them legitimately. They’re not authoritative depictions of how things look, but they are handy for illustrating ideas. In what follows, I’ll tell you how they work and address ethical and practical concerns.
Read the full story in our Plus Newsletter (22.15.0, 2025-04-14).
This story also appears in our public Newsletter.
-
Microsoft wants to hear from you
I received this in an email from Microsoft this morning. It does not appear to be an April Fool’s joke. Take the company at its word — if you get this survey request, tell Redmond what you think.
I was struck by two things about the mostly multiple-choice survey. First, most of the available choices tended to favor Microsoft. For example, in a set of five choices, three could be construed as favorable to the company while only two were unfavorable. That’s not balanced.
The other was the following question: “How long have you used Copilot for?” I’m no English savant, but that phrasing would not have escaped my attention while editing an article for AskWoody. It would certainly have been rewritten by Roberta Scholz. I decided to ask Copilot to copyedit the sentence, and it suggested “How long have you been using Copilot?”
This suggests that the survey was written by a human, albeit one whose English-language skills are slightly below average.
-
Microsoft 365 changes, and Copilot
MICROSOFT 365
By Peter Deegan
Microsoft has made huge changes to its 365 consumer plans, including the intrusive addition of Copilot into Word, Excel, PowerPoint, and Outlook.
It’s the biggest transformation of Microsoft 365 Personal and Family (Home) plans in over a decade. Worse, it’s led to inevitable misinformation and screwy advice on social media.
These are changes that all Microsoft 365 customers need to understand.
Read the full story in our Plus Newsletter (22.08.0, 2025-02-24).
-
The Casio question
I wanted to write a letter to Casio in the US. I knew neither who the CEO was nor where the US headquarters were. So I decided to ask Copilot via Bing.com:
“Who is the president of Casio US, and what is the US headquarters’ mailing address?”
I then verified the answers from alternate sources. The mailing address Copilot returned was correct. The CEO was wrong. Copilot named Makoto Ori, who did assume that position in 2021. But in 2023 a new CEO, Tomoo Kato, was appointed, a fact I established with an ordinary search and some further reading. Copilot’s source was a news article from 2021.
Having found the new name, I typed it into Copilot. It responded that Kato-san was the current CEO.
Copilot knew both names but gave me the wrong one when I asked. What are we to make of this?
-
The mustache question
We’ve been having something of a debate around here regarding the use of AI. Susan, as you well know from her numerous posts and columns about it, warns about jumping into Microsoft’s Copilot service. I agree that caution is warranted, and one of the first things I reported here was the abusive threat Bing’s chat feature made to a reporter. I have a slightly different take, based on the premise that these AI services and features are here whether we like them or not, and whether we fear them or not.
I conducted a very simple and brief experiment based on questioning two assistants. I asked Bing search, “Who was the last US president to have a mustache?” Bing did more or less what I expected — it gave me a list of results, with the first being a link to a Wikipedia article about facial hair on all presidents. I saw nothing on the first page with the answer. Further research would have been necessary.
Then I asked Copilot. Its response? “William Howard Taft.” It provided several citations in support. Of course, that’s the correct answer, and Copilot did the research for me. Annoyingly, Copilot’s response included a question for me: “Is there a particular reason you’re interested in presidential facial hair?” I was tempted to tell it “None of your business.”
So, I ask you: Which assistant do you think did a better job?
-
Saying no to patches
ISSUE 22.03 • 2025-01-20 PATCH WATCH
By Susan Bradley
Both Apple and Microsoft are providing updates and options that are unnecessary.
The good news for you Apple users is that the company is not taking a page out of Microsoft’s forced-change playbook and instead is letting us easily opt out of AI features. Clearly, it learned from its 2014 blunder of forcing the U2 album Songs of Innocence into the iTunes library on every iPhone.
When you receive a pop-up on your Apple device that supports Apple Intelligence, you get a “Not now” option that allows you to easily dismiss the request. For now, Apple’s AI is still somewhat limited and covers only writing, email, and Siri. More AI capabilities are to come later, but it’s good to see that we can easily opt out.
Read the full story in our Plus Newsletter (22.03.0, 2025-01-20).
This story also appears in our public Newsletter.
-
Microsoft 365 and Office in 2024 and beyond
MICROSOFT 365
By Peter Deegan
Let’s do a low drone pass over another year of innovation and hype in Microsoft 365 and Office.
Amazingly, there were some non-AI highlights.
As I review what happened in 2024, I’ll also provide a few notes about what to watch out for in 2025.
Read the full story in our Plus Newsletter (21.53.0, 2024-12-30).
-
No, Microsoft isn’t stealing your data to feed Copilot
MICROSOFT 365
By Peter Deegan
Social media “experts” are touting a false “fix” to stop Microsoft from using your Word, Excel, or PowerPoint files to train Copilot AI. Microsoft has only itself to blame for customers being suspicious.
According to this rumor, Microsoft quietly turned on a way to scrape Word and Excel documents to train its Copilot AI system. It then suggests a way to “opt out” of this “new” intrusion. Supposedly, disabling the “Connected Services” in modern Office (File | Options | Trust Center | Trust Center Settings | Privacy Options) will stop Microsoft from spying on documents and using them to train Copilot.
Not true.
Read the full story in our Plus Newsletter (21.52.0, 2024-12-23).