Newsletter Archives
-
MS-DEFCON 2: Continuous dribbles
ISSUE 21.23.1 • 2024-06-06 By Susan Bradley
When will it stop raining?
Remember the old Star Trek episode titled “The Trouble with Tribbles”? Well, we’ve got trouble with dribbles, leading me to raise the MS-DEFCON level to 2.
Changes are coming to Windows, both 10 and 11, but not necessarily to everyone. I find this the most annoying part of recent updates to Windows: this “you may see it; you may not see it” approach is part of nearly all recent changes. If you want to see these changes after installing the June updates, you’ll just have to be patient.
Anyone can read the full MS-DEFCON Alert (21.23.1, 2024-06-06).
-
Apple to enhance Siri privacy protection
From Nathaniel Parker:
Apple has recently made a statement concerning a series of privacy enhancements to Siri, a follow-up to Apple’s halting of its program in which employees listened to Siri requests for “grading” purposes.
After briefly describing how Siri protects customer privacy in its current iteration and how Siri’s “grading” program works, Apple apologized for not fully communicating the details of the program, reiterated that the program is currently halted, and announced that it will resume in the fall after a software update (likely in iOS 13 and the other major Apple operating system updates that utilize Siri).
When the “grading” program resumes in the fall, the following changes will be made, according to Apple’s statement:
- First, Apple will no longer retain audio recordings to help improve Siri. Apple will, however, continue to use computer-generated transcripts to help improve Siri.
- Second, Apple will allow customers to opt in to help improve Siri by letting it learn from their audio samples. Those who opt in can also opt out at any time, and Apple will apply strong privacy controls to the collected data.
- Third, when customers do opt in to help improve Siri by allowing it to learn from their audio samples, only Apple employees (not third-party contractors) will be able to listen to the samples. Apple employees will also work to delete audio samples that are determined to have inadvertently triggered Siri.
Two points Apple did not specifically include in the statement are:
- Whether customers can opt in to or out of allowing Apple to use computer-generated transcripts to help improve Siri. From the reports I have read on other Apple and tech news sites, it sounds as though Apple will continue to use computer-generated transcripts without the ability for customers to opt out (although, according to the current iteration of Siri’s privacy protections, the data should be randomized so as not to tie it to a user’s personal information).
- Whether customers will need to upgrade to iOS 13 (or the other major Apple operating system updates that utilize Siri) to take advantage of the new “grading” program opt-in. I am especially concerned for those on older Apple hardware that cannot upgrade to the latest operating system releases, and I wonder whether Apple might address such concerns in minor updates to older Apple operating systems.
In general, I trust Apple’s privacy stance with Siri more than I do Apple’s competitors.
With Apple’s competitors such as Amazon (Alexa), all of my Alexa recordings are stored on Amazon’s servers and tied to my Amazon account (although I can delete any of my recordings at any time).
It is good, however, that Apple is being forthcoming about the current Siri “grading” program, addressing the concerns with it, and making the necessary adjustments this fall. I hope Apple will clarify the two points above, and I look forward to seeing how Apple fully rolls out the new privacy enhancements this fall.
I respect Apple for working hard to keep privacy at the forefront of the customer experience, and it is another reason I enjoy using Apple’s products and services.
-
Apple’s revelations about keeping/scanning Siri recordings demand a response
Excellent article out this morning from Jonny Evans in Computerworld.
You may have heard Friday’s assertion from the Guardian:
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant
For a company that touts its privacy superiority, that’s clearly way over the line. Even I was shocked – and I’ve been jaded by years of Microsoft’s snooping.
This morning, Jonny Evans published a clear plan for fixing the wrongs:
- Apple should introduce much clearer, easier-to-understand privacy warnings around the use of Siri on its devices.
- When setting up Siri on a new device you as a user should be given the chance to explicitly reject use of your voice for any purpose other than the original request.
- Apple should bring this [contracted human snooping] work in-house, become completely accountable for what its voice workers and management do with these recordings, and ensure customers have some way in which to punish any infraction of their data privacy.
- In the event Siri is invoked but no specific request is made, the system should be smart enough to ignore the interaction and delete any recording made as a result of that interaction.
- Only in those instances in which different voice recognition systems can’t find a way to agree on what is said should human ears be necessary.
It’s an excellent article. Windows users take note.