Critics Say Apple Built a 'Backdoor' Into Your iPhone With Its New Child Abuse Detection Tools - Gizmodo


The concern here obviously isn’t Apple’s mission to fight CSAM; it’s the tools the company is using to do so, which critics fear represent a slippery slope. In an article published Thursday, the privacy-focused Electronic Frontier Foundation noted that scanning capabilities like Apple’s could eventually be repurposed to hunt for other kinds of images or text, effectively creating a workaround for encrypted communications, one designed to police private interactions and personal content. According to the EFF:

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.

Such concerns become especially germane when it comes to the features’ rollout in other countries, with some critics warning that Apple’s tools could be abused and subverted by corrupt foreign governments. In response to these concerns, Apple confirmed to MacRumors on Friday that it plans to expand the features on a country-by-country basis, and that it will conduct a legal evaluation before distributing them in any given country, the outlet reported.

In a phone call with Gizmodo on Friday, India McKinney, director of federal affairs for the EFF, raised another concern: because both tools are unauditable, it’s impossible to independently verify that they are working the way they’re supposed to.

“There is no way for outside groups like ours or anybody else—researchers—to look under the hood to see how well it’s working, is it accurate, is this doing what it’s supposed to be doing, how many false positives are there,” she said. “Once they roll this system out and start pushing it onto the phones, who’s to say they’re not going to respond to government pressure to start including other things—terrorism content, memes that depict political leaders in unflattering ways, all sorts of other stuff.” Relevantly, the EFF noted in its Thursday article that one technology “originally built to scan and hash child sexual abuse imagery” was recently retooled to create a database run by the Global Internet Forum to Counter Terrorism (GIFCT), which now helps online platforms search for and moderate or ban “terrorist” content centered on violence and extremism.
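For readers unfamiliar with how this kind of scanning works, here is a minimal sketch in Python of the general idea behind a hash-database scanner. This is not Apple’s actual system: Apple uses a proprietary perceptual hash (NeuralHash) and an on-device private set intersection protocol, whereas this toy uses an ordinary cryptographic hash, which only matches byte-identical files. All names and the sample digest below are hypothetical, for illustration only.

# Illustrative sketch of hash-database matching, NOT Apple's system.
# Real deployments use perceptual hashes that tolerate resizing and
# re-encoding; SHA-256 here only matches byte-identical files.
import hashlib

# Hypothetical database of hashes of known prohibited images.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path: str) -> bool:
    """Flag a file if its hash appears in the known-hash database."""
    return file_hash(path) in KNOWN_HASHES

if is_flagged("photo.jpg"):
    print("match against known-hash database")

The point the EFF raises is visible even in this sketch: once the scanning machinery exists, repurposing it for any other category of content is as simple as adding entries to the hash database.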

Because of all these concerns, a cadre of privacy advocates and security experts has written an open letter to Apple asking the company to reconsider its new features. As of Sunday, the letter had over 5,000 signatures.

However, it’s unclear whether any of this will have an impact on the tech giant’s plans. In an internal company memo leaked Friday, Apple software VP Sebastien Marineau-Mes acknowledged that “some people have misunderstandings and more than a few are worried about the implications” of the new rollout, but said the company will “continue to explain and detail the features so people understand what we’ve built.” Meanwhile, the National Center for Missing & Exploited Children (NCMEC) sent an internal letter to Apple staff in which it referred to the program’s critics as “the screeching voices of the minority” and praised Apple for its efforts.
