• Apple plans to break its end-to-end encryption

    #2386681

    PUBLIC DEFENDER
    By Brian Livingston

    Apple Computer shocked computer-security experts when the Cupertino company announced on August 5 that it plans to …
    [See the full post at: Apple plans to break its end-to-end encryption]

    3 users thanked author for this post.
    Viewing 14 reply threads
    • #2386749

      Thanks, Brian, for a thorough and insightful article about what Apple is doing with its cloud accounts and devices. This is not click-baiting — it is serious tech news, as I have come to expect from the AskWoody site and newsletter.

      So much for Apple’s promises to put user privacy first, even at the expense of butting heads with law enforcement.

      I do hope Apple reconsiders its latest foray into the Surveillance Economy. Even the most well-intended breaches of privacy and device or account security can have far-reaching implications for user security, privacy and liberty. This is an important issue, and there are few if any easy answers.

      As an Apple non-user, I am not immediately impacted by this development. But as a user of other cloud-based data sharing, storage, and email services, I am concerned about where this Surveillance Economy is going.

      There has to be a better way to accommodate law enforcement without infringing on Liberty, both in the US and around the world.

      -- rc primak

      3 users thanked author for this post.
    • #2386836

      Perhaps the old-school technique of tapping the phone lines of those suspected of having committed, or of planning to commit, a serious enough crime, if and only if authorized by a judge, might still be sufficient, adapted for online use in this tech-besotted world we live in?

      Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

      MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell-architecture Intel CPU with 4 cores and 8 threads, model i7-4870HQ @ 2.50GHz.
      Intel Iris Pro GPU with built-in bus, VRAM 1.5 GB, display 2880 x 1800 Retina, 24-bit color.
      macOS Monterey; browsers: Waterfox "Current", Vivaldi, and (now and then) Chrome; security app: Intego AV

      1 user thanked author for this post.
    • #2386845

      Krawetz explains that “Facebook submitted 20,307,216 reports to NCMEC” in 2020 alone. Were 20 million child abusers arrested? Of course not.

      Perhaps not, but 20 million child abusers were identified and are now subject to arrest. This is a lack of capacity on the part of law enforcement, not a lack of guilt on the part of those who have those images.

      Clearly, you have an opinion about what Apple is doing, but I trust you also understand that your opinion is not shared by everyone. You might be surprised how many do not share it.

    • #2386869

      MHCLV941:

      This paragraph supports your claim. It follows immediately after the one you have copied:

      The use of technology to automatically submit reports of suspected images has already overwhelmed law enforcement. In 2019, NCMEC received “69.1 million images, videos, and other files,” according to a Dayton Daily News article. “There are more cases than law enforcement can potentially work,” says John Shehan, a vice president of the agency.

      I find the following at times not very informative, at other times rather contradictory, and in some parts beside the point, and therefore not very helpful in understanding the future implications of Apple’s actions:

      “Assigning this much power to software guarantees it will expand” (Subtitle)

      Since the FotoForensics site knows how to create the NCMEC hashes, it compares every photo that users upload against more than 3 million hashes it’s obtained from NCMEC and various law-enforcement agencies. In the past six years, only five images out of 5 million or so have matched a hash in the image bank. One of these so-called child p*** (*) images, Krawetz reports, “was a fully clothed man holding a monkey.”

      So what happens to those photos that are vetted and found not to be child p***? Are they kept in a database, or deleted if the AI checks prove them irrelevant?
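
      A minimal sketch, in Python, of the kind of lookup the quoted passage describes: each uploaded photo’s hash is checked against a bank of known hashes, and only a match produces anything for a human to look at. The hashing choice (SHA-256 of the file bytes), the file names, and the function names are illustrative assumptions; the quote does not say how FotoForensics actually computes or stores its hashes, nor what happens afterward to photos that do not match.

      ```python
      # Illustrative sketch only: exact-hash lookup against a bank of known hashes.
      # A real service would likely use a perceptual hash rather than SHA-256.
      import hashlib
      from pathlib import Path

      def file_hash(path: Path) -> str:
          """Hash the raw file bytes of one uploaded image."""
          return hashlib.sha256(path.read_bytes()).hexdigest()

      def check_upload(path: Path, known_hashes: set) -> bool:
          """Return True only if the upload matches the hash bank."""
          return file_hash(path) in known_hashes

      # Usage: load the bank of known hashes once, then test each upload against it.
      known = {line.strip() for line in open("hash_bank.txt")}   # hypothetical file
      if check_upload(Path("upload.jpg"), known):                # hypothetical upload
          print("match: set aside for human review")
      ```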

      Harvard’s Schneier points out in a blog post that governments exploit four genuine problems to panic citizens into approving Big Brother surveillance schemes that would otherwise be rejected by thinking citizens:

      Beware the Four Horsemen of the Information Apocalypse: terrorists, drug dealers, kidnappers, and child p***s. Seems like you can scare any public into allowing the government to do anything with those four.

      OK. Maybe relevant.

      Schneier reminds us that even the US Department of Defense emphasized in 2019 the need for end-to-end encryption software to protect vital secrets. The DoD statement says:

      As the use of mobile devices continues to expand, it is imperative that innovative security techniques, such as advanced encryption algorithms, are constantly maintained and improved to protect DoD information and resources. The Department believes maintaining a domestic climate for state of the art security and encryption is critical to the protection of our national security.

      So?

      Many countries — from China to Russia to India and beyond — are already using technology for surveillance and control of their restive populations.

      True, but how is that Apple’s fault? What is the connection of this to scrutinizing the uploaded photos with software only (I hope)? What does this surveillance by those governments have to do with anything here?

      (*) Trying to please the bad-language filter, to see if it can be persuaded to post my comment or not.

      2 users thanked author for this post.
    • #2386908

      True, but how is that Apple’s fault? What is the connection of this to scrutinizing the uploaded photos with software only (I hope)? What does this surveillance by those governments have to do with anything here?

      That’s the ‘cry wolf’ by the EFF… These governments, and others, will surely force Apple to scan for opposition figures, disliked journalists, reactionaries, words like ‘Tiananmen’ and ‘Pooh the Bear’… in iCloud and on Apple devices…

      NO, Apple doesn’t plan to break its end-to-end encryption.

      1 user thanked author for this post.
    • #2386918

      According to the article in the Newsletter:

      Future Apple operating systems will be loaded with an image bank of child p***, with all images “hashed” into unreadable digital signatures. When an Apple device backs up images to iCloud Photos — which is “on” by default — the device will generate a hash for each image. Apple says the system kicks in if a user backs up to iCloud 30 or more images with hashes that are close to ones in the p*** collection at NCMEC (the National Center for Missing and Exploited Children) and at least one other organization. In that event, an Apple employee manually reviews the alert, disables the user’s iCloud account, and reports the user to NCMEC. That nonprofit organization may then file a case with the Federal Bureau of Investigation or local police.

      “This is a security disaster,” said cryptography expert Bruce Schneier, a Harvard Kennedy School Fellow. “It opens the door for all sorts of other surveillance, since now that the system is built it can be used for all sorts of other messages. And it breaks end-to-end encryption, despite Apple’s denials.”

      Putting the above together, it is clear to me, reading the first quoted paragraph, that Apple is going to add something to its operating system that will compare a photo, or the photos in a user’s message still sitting in the mail client’s outbox, against partial hashes in a big file it will install on the HD or SSD. This happens before the message is sent, so before it is encrypted, and a message goes to Apple’s child-p*** cop-on-the-beat employee if there are too many matches between what is being sent and known child-p*** photos. Therefore, strictly speaking, this future modification by Apple to macOS does not break end-to-end encryption, even though the title of the newsletter says it does.
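
      A rough sketch of the counting logic that paragraph describes, as I read it: hash each photo queued for upload, compare it against a locally installed database, and raise an alert only past a threshold, all before anything is encrypted for transmission. Apple’s announced design (NeuralHash, threshold secret sharing, safety vouchers) is far more elaborate; the Hamming-distance matcher, the distance cutoff, and every name below are assumptions for illustration only.

      ```python
      # Illustrative sketch of on-device threshold matching before upload/encryption.
      MATCH_THRESHOLD = 30        # the figure quoted in the newsletter excerpt
      MAX_HAMMING_DISTANCE = 4    # how "close" a hash must be to count; value assumed

      def hamming(a: int, b: int) -> int:
          """Number of differing bits between two integer-valued perceptual hashes."""
          return bin(a ^ b).count("1")

      def is_near_match(photo_hash: int, known_hashes: list) -> bool:
          """True if the photo's hash is close to any entry in the local database."""
          return any(hamming(photo_hash, h) <= MAX_HAMMING_DISTANCE for h in known_hashes)

      def review_needed(queued_hashes: list, known_hashes: list) -> bool:
          """True once enough photos queued for upload resemble database entries."""
          matches = sum(1 for h in queued_hashes if is_near_match(h, known_hashes))
          return matches >= MATCH_THRESHOLD
      ```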

      What Apple developers are working on that is worrying to some, for the reasons explained in the second paragraph I have quoted above, is to demonstrate a system that can be used to check the emails or messages a user sends to others before the messages are sent; that, in principle, can also be used to check for anything, not just a certain type of photo. Now, this is not going to be the first system ever meant for such a purpose, even if the design is, maybe, an original one. But assuming the Chinese government, or whoever, will decide to copy it with nefarious intent is not necessarily correct, as they already must have plenty of know-how and means, including pretty sophisticated software of their own, for keeping a Big Brother’s eye on their people.

      Also, such governments have no time at all for end-to-end encryption of private individuals’ messages by Apple or whoever; besides, regardless of encryption, they can grab hold of any and all of those messages with absolute ease. And, if it suits such governments, they can share their own snooping technology with any nations and groups they want.

      So this development is worrying. But is this “a security disaster”? Saying that it is, at least for now, is going too far, I think.

    • #2387055

      the old-school technique

      We have more on that coming.

    • #2387073

      So not using iCloud would get around this? It seems a bit nonsensical to even waste time developing this database and including it in future iOS releases. Maybe this is a case of “it’s the thought that counts.” There must be a better way.

      • #2387079

        DriftyDonN: As far as I know, this is not related to iCloud. It is about some software and a database that are going to be put inside a Mac when its user gets either an update or an upgrade (not sure which) of macOS, not now, but at some future time. It is an idea people at Apple are having, one that has recently bobbed up to other people’s attention, not a clear and present danger. So if you do not have a Mac, you need not worry. I have a Mac, and I couldn’t care less, because I like children, but not that way, and, if necessary, I think I can get enough good character testimonials to show in court — assuming that many judges will take cases coming from a dubiously constitutional detection procedure. That is a good thing, because people can be misidentified by Apple’s filters. Besides, I am not the worrying kind.

        And if this really gets bad, then I’ll get myself a PC with Linux installed, and go from there. I know enough Linux to do that.

        1 user thanked author for this post.
    • #2387168

      As far as I know, this is not related to iCloud

      It is related to iCloud and to uploading/backing up images only.
      The other part is checking images in incoming/outgoing iMessage messages sent or received by minors.
      In that case the minor receives a warning that he or she is about to view or send an improper image. If the minor opens or sends the image anyway, a notice is sent to the parents (no image is sent to them).
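
      A rough sketch of that decision flow for a child’s account, with the on-device classifier reduced to a boolean input: warn first, and notify the parents only if the child goes ahead, never forwarding the image itself. All names here are hypothetical illustrations, not Apple’s published API.

      ```python
      # Illustrative sketch of the iMessage child-safety flow described above.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class ChildAccount:
          name: str
          parent_contact: Optional[str] = None

      def handle_flagged_image(account: ChildAccount, classifier_flagged: bool,
                               child_proceeds: bool) -> str:
          """Decide what happens to one incoming or outgoing image on the child's device."""
          if not classifier_flagged:
              return "delivered normally; nothing logged"
          if not child_proceeds:
              return "blocked after warning; nobody notified"
          # The child chose to view or send anyway: the parents get a notice, not the image.
          return f"delivered after warning; notice sent to {account.parent_contact}"

      print(handle_flagged_image(ChildAccount("kid", "parent@example.com"), True, True))
      ```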

      • #2387275

        True enough. And only a possible, sometime-in-the-future issue, and for Mac users alone, as it would work only with something to be installed on Macs, and only on Macs, by Apple itself, assuming this ever happens, the future being unknown to mere mortals like us.

        If one has a Windows or a Linux machine, not a problem.

    • #2387336

      If one has a Windows or a Linux machine, not a problem

      If one has a Windows or a Linux machine and uses ANY cloud service, one’s images are “scanned” for CSAM (provided one hasn’t encrypted the images before uploading).
      If one has a Windows or Linux smartphone and uses Facebook, Twitter, Instagram… images are “scanned” for CSAM.
      If one has a Mac and uses Facebook, Instagram, Twitter… or uses any cloud service, images are “scanned” for CSAM.

      1 user thanked author for this post.
      • #2387421

        Alex: I think we are commenting on different things. I am explaining what would happen if Apple’s plan to install telltale software with a database on Macs were to go ahead, and who would be affected because of that. Other cloud issues are not exclusively Mac issues, true enough. But they are not Mac issues brought about by a possible Apple modification of macOS, which is what I was commenting on in my answer to DriftyDonN.

        To be free of those other intrusive searches of one’s emails you have mentioned, the remedy is to stay away from the Cloud as much as possible, as I do, something that is possible for many. It ultimately comes down to what people choose to do and how to do it. As for those who must use the cloud for some compelling reason, well, we all have to take risks sometimes; that’s how it is.

        • #2387485

          OscarCP said, “To be free of those other intrusive searches of one’s emails you have mentioned, the remedy is to stay away from the Cloud as much as possible.”

          My thoughts exactly. I mistakenly thought iPhones were included.

          Thanks!

    • #2387544

      To be free of those other intrusive searches of one’s emails you have mentioned

      I haven’t mentioned any emails, and Apple doesn’t scan images in emails on any device for CSAM. Only images uploaded to iCloud. That won’t change, and no government will demand it.

      1 user thanked author for this post.
      • #2387620

        You are right. While writing, I was preoccupied with emails for other reasons, and I was, incorrectly in this case, thinking of photos sent with emails, which is not relevant. But never mind that: this is an issue of importance that I feel merits comment, and I stand by the main thrust of my opinions on it. That any photos sent to iCloud, or Azure, or wherever, are scanned there is not the issue I am discussing, nor, I believe, the one we are supposed to be concentrating on here. That is a different important issue which, I think, merits its own thread.

    • #2387545

      That won’t change and no government will demand it.

      From your lips to Putin’s and Xi Jinping’s ears…

      1 user thanked author for this post.
      • #2387547

        They haven’t done so, so far, while every cloud service scans images for CSAM and for nothing else.
        So there is nothing new about this system, which has now been running for years.

      • #2387628

        They might not need to demand it from Apple, or whomever, if they already have what they need for that. I would be surprised to learn they don’t, and still need help from Apple to either get it, or to implant it with mandatory/sneaky “apps” or by some other means in everyone’s iPhones, at least within their respective realms.

    • #2387595

      Apple delays rollout of CSAM detection system and child safety features

      “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

      4 users thanked author for this post.
      • #2387635

        It’s nice to know that ONE major tech company is not deaf and blind to what its customers have to say. Now, is there a really GOOD audiologist and a really GOOD optometrist in Redmond????

        1 user thanked author for this post.
    • #2387729

      before releasing these critically important child safety features

      So critically important that we can put them on the back burner for a few months!

      cheers, Paul

      1 user thanked author for this post.
    • #2387730

      before releasing these critically important child safety features

      So critically important that we can put them on the back burner for a few months!

      cheers, Paul

      Some people are just never happy.

      1 user thanked author for this post.
      • #2387836

        And some people may be more worried about lawsuits that could be expensive in more ways than one, than about someone else’s child protection concerns. And good intentions (Apple’s in this case, I would like to believe) are said to pave the road to (legal) Hell.

        3 users thanked author for this post.