Privacy: Apple defends measures against child pornography

Does Apple’s system for detecting child abuse images on iPhones open the door to government surveillance? The company says it can prevent that.

Apple is pushing back against fears that its newly announced system for detecting child abuse photos could be misused for surveillance.

The company will reject any attempts by governments to abuse the process to search for other content, Apple assured on Monday.

The iPhone maker announced new child protection measures last week. Among them, initially limited to the USA, is a system intended to detect images of child sexual abuse on users' devices, provided those users store their photos in the company's iCloud online service.

The system is not meant to analyze the content of every stored image. Instead, a file containing so-called hashes of known child abuse imagery, a kind of digital fingerprint of each image, is loaded onto the devices. Using special comparison procedures, a copy of a known photo can then be recognized, while the original image cannot be reconstructed from its hash.
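The matching step described above can be sketched roughly as follows. This is only an illustration using a standard cryptographic hash from Python's standard library; Apple's actual system uses a proprietary perceptual hash ("NeuralHash") that also matches slightly altered copies of an image, and the real database and matching protocol are considerably more involved.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images.
# (Apple's real system uses perceptual NeuralHash values, not
# plain SHA-256; this sketch only illustrates the principle.)
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint is in the database.

    The hash is one-way: the original image cannot be
    reconstructed from the fingerprint alone.
    """
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in known_hashes

print(matches_known_image(b"known-image-bytes"))  # True: exact copy recognized
print(matches_known_image(b"some-other-photo"))   # False: unknown image
```

Note that with a plain cryptographic hash, as here, even a one-byte change produces a completely different fingerprint; a perceptual hash is designed so that visually similar images map to the same or nearby fingerprints.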

If there is a match, the suspicious image is tagged with a certificate that allows Apple, as an exception, to open it after it has been uploaded to iCloud and examine it. The system only raises an alarm once a certain number of matches has been reached; how many are required is not made public.
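The threshold behavior, with the actual number kept secret by Apple, might be sketched like this. The value used here is purely an assumed placeholder, not Apple's real threshold.

```python
# Hypothetical threshold: Apple does not disclose how many matches
# are required before the system raises an alarm.
ASSUMED_THRESHOLD = 30  # placeholder value for illustration only

def account_flagged(match_count: int, threshold: int = ASSUMED_THRESHOLD) -> bool:
    """No alarm is raised until the number of matches reaches the threshold."""
    return match_count >= threshold

print(account_flagged(1))   # False: a single match stays silent
print(account_flagged(30))  # True: the assumed threshold is reached
```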

If child abuse material is actually confirmed during the review, Apple reports it to the American non-governmental organization NCMEC (National Center for Missing & Exploited Children), which in turn can bring in the authorities.

Cryptography experts and IT security researchers, among others, had criticized that the mere creation of such a system opens the door for authoritarian governments to demand that Apple add other content to the hash database and thus conduct political surveillance. Well-known images of the violently suppressed 1989 protests on Tiananmen Square in Beijing were cited as an example.

"We have been asked before to weaken the protection of user privacy through government-mandated changes, and we have steadfastly refused," Apple wrote in a Q&A published in response to the criticism. "We will continue to refuse in the future." The process is also designed so that hashes of other content cannot be secretly smuggled onto the devices, the company said; the database consists exclusively of known images of severe child abuse that have been confirmed by child protection organizations.
