Apple’s child-abuse scanner has serious flaw, researchers say

  • Thread starter: The Verge RSS

Author: Russell Brandom


Illustration by Alex Castro / The Verge

Researchers have found a flaw in iOS’s built-in hash function, raising new concerns about the integrity of Apple’s CSAM-scanning system. The flaw affects the hashing system, called NeuralHash, which allows Apple to check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures.
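In outline, such a system compares a hash of each on-device photo against a database of hashes of known imagery, so the scanner never needs the images themselves. The sketch below is a loose illustration of that bare lookup step, not Apple's implementation: NeuralHash itself and the blinded matching protocol that hides non-matches are deliberately omitted, and the digest value is a placeholder.

```python
import hashlib

# A minimal sketch of hash-set matching -- NOT Apple's system. The real
# design uses NeuralHash (a perceptual hash) plus a cryptographic blinded
# matching step so the device learns nothing about non-matching photos.

# Hypothetical database of hex digests of known images (placeholder value).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```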

On Tuesday, a GitHub user called Asuhariet Ygvar posted code for a reconstructed Python version of NeuralHash, which he claimed to have reverse-engineered from previous versions of iOS. The GitHub post also includes instructions on how to extract the NeuralHash model files from a current macOS or iOS build.

“Early tests show that it can tolerate image resizing and compression, but not...
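Tolerance to such transformations is the defining property of a perceptual hash: visually similar images should yield nearly identical hash values, with a match decided by a small Hamming-distance threshold rather than exact equality. The snippet below illustrates the general idea using the open-source Pillow and imagehash packages, which are unrelated to NeuralHash; the file name and threshold are placeholders.

```python
# A rough illustration of perceptual-hash robustness -- this is not
# NeuralHash, just the same general class of algorithm.
from PIL import Image
import imagehash

original = Image.open("photo.jpg")  # placeholder input file
shrunk = original.resize((original.width // 2, original.height // 2))

h1 = imagehash.phash(original)  # 64-bit perceptual hash
h2 = imagehash.phash(shrunk)

# Subtracting two hashes yields their Hamming distance; a small distance
# means "visually the same image", so matching uses a distance threshold.
print(h1 - h2)        # typically a small number for a mere resize
print(h1 - h2 <= 8)   # example match decision at a placeholder threshold
```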

Continue reading…
