
Apple reportedly will scan iPhones for child abuse imagery


Apple will soon include software on iPhones in the U.S. that will scan the devices for child abuse imagery, according to a new report.

The Financial Times reported the news, citing sources.

Matthew Green, a security expert and associate professor at Johns Hopkins Information Security Institute, also tweeted about Apple's decision on Wednesday. Green has worked with Apple in the past to patch issues that could have allowed hackers to decrypt photos and videos in iMessage.

Green, also citing sources, said the scanning would be "client-side," or done on an individual's iPhone. In a follow-up tweet, Green said the system would be used on a phone's photo library and only if you have iCloud Backup turned on. So it would "only scan data that Apple's servers already have." The system would use a hashing algorithm to match photos on a phone to known child abuse images.

"These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear," Green tweeted.

Green notes that someone might be able to make "problematic images that 'match' entirely harmless images."

"Imagine someone sends you a perfectly harmless political media file that you share with a friend. But that file shares a hash with some known child porn file?," Green wrote.

However, Green also noted that eventually the scanning "could be a key ingredient in adding surveillance to encrypted messaging systems," which is why he argued Apple's move would be a "really bad idea."

https://twitter.com/matthew_d_green/status/1423077918444507139

"I don’t particularly want to be on the side of child porn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends," Green tweeted.

U.S. law enforcement agencies such as the FBI and the Department of Justice have long urged tech companies to create backdoors into encrypted messaging, often citing the need to find people who share child abuse material.

Meanwhile, privacy and tech advocates have also long noted that adding a backdoor for authorities could inevitably lead to that backdoor being exploited.

AppleInsider notes that Apple has not confirmed the plan.

The post Apple reportedly will scan iPhones for child abuse imagery appeared first on The Daily Dot.
