In this issue
ON SECURITY: Get ready for AI

Additional articles in the PLUS issue
MICROSOFT 365: All the places a “missing” email can be hiding
FREEWARE SPOTLIGHT: AM-DeadLink — Because nothing lasts forever
WINDOWS 11: Windows Terminal arrives?

ON SECURITY
Get ready for AI
By Susan Bradley

Not a day goes by that we don’t hear about some new technology using AI. Whether we like it or not, vendors are going to be slapping a coat of AI on just about everything to ensure it gets in front of us. We already know that many in our readership do not want AI in their technology. But what if you do want to embrace it? What should you be concerned about, or at least aware of, before you start using it?

It knows only what it has access to
On a test system, I recently enabled what I consider to be “the whole enchilada.” Translation: I enabled Microsoft 365 Copilot. With a $300-per-year, up-front investment, I’m reviewing the risks and rewards. Here are some of my early takeaways.

If your data is stored in a traditional, on-premises Active Directory, or locally on your computer and not synchronized or saved to OneDrive, Microsoft can see only those email communications going through its cloud-based email services. Make no mistake — Microsoft can scan the contents as the emails go through its mail system, just as every other email vendor does. Don’t ever consider email a secure method of transport. It’s not. Never has been. I know there aren’t Microsoft engineers paid to read every one of my emails. But I also know there are computer systems that can summarize much of it — driven, of course, by AI technology.

The more information that goes through the Microsoft cloud, the more the company can make AI-based recommendations and suggestions in the software. On the flip side, the more you keep on your own local storage, the less the AI can learn. Thus we have the first lesson of AI: understand what the AI software is built to look at, and understand that if that information is not in the cloud, you won’t get the full benefit of AI services.

Sometimes a user has access to something on the network, but the system administrator didn’t realize those rights had been granted. So it was interesting to me when I saw indications of file access, with filenames shown in the recaps. It’s not uncommon in business for a filename to be indicative of the file’s contents and thus provide hints. For example, the filename “Termination letter for Joe Shmoe.docx” clearly refers to a sensitive human-resources and payroll matter that should be “eyes only” until it takes effect. Just seeing the filename leaks the information.
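A filename audit of the kind that caught my eye can be done mechanically before you turn Copilot loose. Here’s a minimal sketch, assuming an illustrative keyword list of my own (not anything Microsoft provides), that walks a folder tree and flags files whose names alone hint at sensitive content:

```python
import os

# Illustrative keywords only -- an assumption for this sketch, not an
# official list. Tune it for your own organization.
SENSITIVE_KEYWORDS = ("termination", "salary", "payroll", "disciplinary")

def flag_sensitive_filenames(root):
    """Walk a folder tree and return paths whose file names alone hint
    at sensitive content, so their permissions can be reviewed."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if any(k in name.lower() for k in SENSITIVE_KEYWORDS):
                flagged.append(os.path.join(dirpath, name))
    return flagged
```

Run something like this against a shared drive, then double-check who has access to anything it flags.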
Before AI summaries were available, improper permissions on a document or its containing folder could go unnoticed — no one was the wiser. But now your “AI assistant” can go poking around, exposing information you had no intention of sharing. Thus the second lesson of AI: review how you’ve set up your permissions and file structure, and see whether you need to make changes accordingly.

What about consumers?
Microsoft is offering Copilot Pro for $20 (USD) per user per month for, as it says, “individuals, creators, and power users looking to take their Copilot experience to the next level.” It is intended for subscribers to Microsoft 365 Personal and Family plans. It offers the same sort of system, one that reviews what comes into your inbox and what you are doing about it. It also offers suggestions.

Sometimes these suggestion displays get in the way. For example, you will find that in the Outlook app on your smartphone, Copilot offers to help you draft your email. Launch Word, and you will see a new Draft with Copilot box that you must dismiss in order to create a document by yourself (Figure 1).
One thing I noticed about Copilot is that its assumptions aren’t always correct. I was composing a simple letter to someone, apologizing for using the wrong computer image in a document. Because I used the word “photo” rather than the more specific phrase “computer image,” I suddenly found myself writing a letter apologizing for taking the wrong wedding photos. What? How did I get to a wedding? “AI hallucination” is a real thing, and you must guard against it.

The other day, I was listening to a presentation about using AI in business. The speaker stated that he had answered several emails using ChatGPT. When I asked whether he had disclosed his use of AI to the client, he said no. This behavior is now making the news in a somewhat circular fashion, in that news outlets themselves are being called out when they fail to disclose the use of AI. His admission, abetted by our editor’s reminder that AskWoody does not yet have a formal AI policy, spurred me to action. I am working on AI policies for the business, the site, and the newsletter so you will always know where we stand. Will gets one or two offers for “guest posts” in the newsletter; just as he always declines those, he will decline AI-generated content.

The next thing I learned was that Copilot requires the monthly release channel of Microsoft 365, not the semi-annual channel that I normally recommend. Because Copilot is still a work in progress, only the monthly releases ensure that you get recent changes, which may include important fixes. I also recommend upgrading to Windows 11 23H2 to fully embrace the Copilot integration. (Nevertheless, I still recommend 22H2 for the less adventurous. Let the dust settle a bit.)
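If you’re unsure which Windows 11 feature update a machine is running, the OS build number identifies it: 22621 is 22H2 and 22631 is 23H2. Here’s a small sketch, with the build-to-name map deliberately limited to the two releases mentioned here:

```python
import platform
import sys

# Build numbers for the two Windows 11 releases discussed above.
BUILD_TO_RELEASE = {22621: "Windows 11 22H2", 22631: "Windows 11 23H2"}

def release_name(build):
    """Map an OS build number to its feature-update name, if known."""
    return BUILD_TO_RELEASE.get(build, f"unrecognized build {build}")

if sys.platform == "win32":
    # On Windows 11, platform.version() returns a string like "10.0.22631".
    build = int(platform.version().split(".")[2])
    print(release_name(build))
```

(Typing winver in the Run dialog shows the same information without any code.)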
So far, based on my limited review of a Windows 10 PC with Copilot activated but not actively used (and with no Microsoft 365 Office plan that includes a Copilot subscription installed), I have not seen any indexing services running in the background that would make me think Microsoft is snooping. If you are among those who want no trace of AI on your system but Copilot gets installed anyway, I don’t see it doing anything until you actually launch the software.

Copilot on Mac?
If you are over there on your Apple devices, thinking that you can sit out all this AI nonsense, guess again. First, let’s be honest: Siri has been AI-like for many years. Apple has long integrated intelligent assistive technology into its platform while emphasizing its strong stand on privacy. But if you use various browsers on your Apple platforms, you too will be getting AI prompts.

You can download a Copilot app from the App Store and install it on your phone or tablet. You can even trick your Mac into installing it: just navigate to the iPhone and iPad apps tab in the App Store and search for “Microsoft Copilot.” In addition, if you or your business subscribes to Microsoft 365 Copilot or Copilot Pro, the Office applications on the Apple platform will also offer recaps and suggestions. (Note: Figure 1 shows Word on a Mac, not a PC!)

Bottom line: If you are licensed for Microsoft Copilot on its various platforms, and if you use Office applications on your Apple devices, you won’t be immune from Copilot-ness. Microsoft truly wants this to be massively cross-platform.

Recommendations for businesses
If you do plan to roll out Copilot, make sure that sensitive data is properly limited to those who need access as part of their job. Remember, Copilot operates with the user’s access rights and thus can reach whatever the user can. You’ll want to use tools, such as Microsoft Purview, to review what the system will have access to. As with many things Microsoft these days, the additional tools needed to properly handle and control cloud deployments are not included in the base subscriptions. You may need to subscribe to Business Premium (in the case of small businesses) or an E3 or E5 license to gain access to sensitivity labels, so you can properly mark the data you don’t want exposed.

Running in the background is something called the Microsoft Graph API. It searches Microsoft 365 content and then adds information from ChatGPT when prompted by a user. As an administrator, you must be aware that whatever content Microsoft Graph exposes to the service will be part of the response. So content in Microsoft Graph — such as emails, chats, and documents that the user has permission to access — will be included. If you aren’t aware that the user has access to something, know that it will be slurped into the pool of information the service uses to provide summaries and recap content.

Currently, each Microsoft app stands on its own, so Word can’t interact with Excel. In Word, you’ll get suggested content if you ask it to write a letter about a topic. As I mentioned earlier, exercise caution by not taking generative content at face value — it may be wrong. Outlook will similarly provide suggestions and also recap a series of conversations between you and your correspondents.

Because I have a forensic background, I always like to review what is needed to investigate technology and its use. Perhaps you want to know whether a user in your organization has improperly used AI, or created a response with AI when your policy prohibited it.
You can still do this, but it comes with a Catch-22: you must sign up ahead of time for premium tools in order to search Microsoft Copilot for Microsoft 365 content. As noted in Microsoft Learn’s post Search for and delete Microsoft Copilot for Microsoft 365 data:

You can use eDiscovery (Premium) and the Microsoft Graph Explorer to search for and delete user prompts and Microsoft Copilot for Microsoft 365 responses in supported applications and services. This feature can help you find and remove sensitive information or inappropriate content included in Copilot activities. This search and deletion workflow can also help you respond to a data spillage incident, when content containing confidential or malicious information is released through Copilot-related activity.

What license is required to investigate? The most expensive Microsoft license — E5. Although Microsoft indicates that you can use the 90-day Purview trial to conduct your analysis if you are not an E5 customer, I’d argue that a business must ensure it can manage Copilot and investigate when necessary. To do that on a moment’s notice, you can’t depend on a trial. The ability to search and delete is offered only in the business Copilot version; I am not aware of any such capability in the consumer version.

Policy first, technology second
No matter who you are — business or individual — you must understand the AI policies set forth by the companies you do business with. I was glad to see that an answer posted in a Microsoft forum disclosed that it had been augmented with AI and that it had been reviewed by a human for clarity and correctness. I’ve now conditioned myself to review anything Bing search or its chat returns.

Here’s a tricky example. I asked a question about Word on iOS and received an answer that referred only to Word on Windows. In a way, that makes sense, because Word on Windows is dominant, with many, many more users. However, the answer was not quite correct for the iOS version. Don’t blindly trust the answers given.

If you are considering any AI technology, assume you are in a testing phase, not a fully deployed phase. Computers aren’t always right. Testing, plus disclosures and policies, should be the top priorities for any business before letting the monster out.

If you want to remove those icons from your computers so that users (or you yourself) aren’t tempted to dabble with Copilot, we have full instructions for consumers and for businesses. If you don’t want it in your Office apps (including Word, Outlook, Excel, and PowerPoint), don’t sign up for or purchase the separate Copilot service for the Office suite. Then neither your Windows nor iOS platform will include Copilot.

Resources
Susan Bradley is the publisher of the AskWoody newsletters.
The AskWoody Newsletters are published by AskWoody Tech LLC, Fresno, CA USA.
Microsoft and Windows are registered trademarks of Microsoft Corporation. AskWoody, AskWoody.com, Windows Secrets Newsletter, WindowsSecrets.com, WinFind, Windows Gizmos, Security Baseline, Perimeter Scan, Wacky Web Week, the Windows Secrets Logo Design (W, S or road, and Star), and the slogan Everything Microsoft Forgot to Mention all are trademarks and service marks of AskWoody Tech LLC. All other marks are the trademarks or service marks of their respective owners. Copyright ©2024 AskWoody Tech LLC. All rights reserved.