
Electronic Frontier Foundation “opposes Apple’s scanning for child abuse images”

In early August, Apple announced two mechanisms to bring child abuse countermeasures to the iPhone and Mac. One notifies parents if their child tries to send or receive sexually explicit photos in the Messages app. The other detects CSAM (child sexual abuse material) images stored in iCloud and reports them to the authorities.

In response, the Electronic Frontier Foundation (EFF) issued a statement opposing what it calls a large-scale surveillance plan and urged iPhone users to sign a petition. The EFF is a US-based non-profit organization that advocates for civil liberties in the digital world. It supported the app tracking limits introduced in iOS 14 not long ago, so it is not anti-Apple, but it came out against the plan on privacy grounds shortly after Apple announced its CSAM measures.

In the statement, the EFF argues that mass surveillance, no matter how well-intentioned, is unacceptable as a strategy for fighting crime. The petition also appeals to users who object to the next version of iOS installing surveillance software on every iPhone.

According to the statement, EFF supporters helped block the EARN IT Act last year, a government-backed attempt to enable scanning of all messages online. That bill, introduced by US senators in March 2020, would have effectively required tech companies to help decrypt encrypted data when law enforcement obtained a warrant, and it was rejected as an encryption backdoor bill.

The EFF also argues that Apple’s plan should be seen for what it is: enabling photo scanning on every iPhone, with no EARN IT-style legislation needed to require it. Apple intends to introduce two scanning systems to the iPhone. One scans photos uploaded to iCloud and compares them against CSAM databases maintained by groups such as NCMEC, the quasi-governmental organization that works with law enforcement on crimes against children.
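
To make the mechanism more concrete, here is a minimal sketch of matching uploaded images against a database of known-image hashes with a reporting threshold. It is only an illustration under stated assumptions: perceptual_hash, known_hashes, and the threshold value are hypothetical placeholders, and Apple’s actual design is far more involved (it uses cryptographic techniques so that individual match results are not revealed on the device) and is not reproduced here.

```python
# Illustrative sketch only: matching uploaded images against a database of
# known-image hashes, with a reporting threshold. perceptual_hash() is a
# hypothetical placeholder (here just SHA-256, which is NOT a perceptual hash),
# and the threshold value is made up for the example.

import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 10  # placeholder value, not Apple's actual threshold


def perceptual_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real one tolerates resizing/re-encoding."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploaded_images: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many uploaded images match the known-hash database."""
    return sum(1 for image in uploaded_images if perceptual_hash(image) in known_hashes)


def should_flag_for_review(uploaded_images: Iterable[bytes], known_hashes: Set[str]) -> bool:
    """Flag an account for human review only once enough matches accumulate."""
    return count_matches(uploaded_images, known_hashes) >= MATCH_THRESHOLD
```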

The other, which works only when a parent or guardian opts in, scans images sent or received by minors in iMessage against an algorithm designed to detect sexually explicit material; when such an image is detected, the iPhone warns the child and, in some cases, notifies the parents.
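
Again purely as an illustration, the sketch below captures that decision flow: nothing is scanned unless a guardian has opted in, a hypothetical on-device classifier is_sexually_explicit decides whether an image is explicit, and parental notification applies only in some cases. The function names and the age cutoff are assumptions made for the example, not details from Apple.

```python
# Illustrative sketch of the opt-in Messages flow described above.
# is_sexually_explicit() is a hypothetical stand-in for an on-device ML
# classifier, and the age cutoff for parental notification is an assumption.

from dataclasses import dataclass
from typing import List


def is_sexually_explicit(image: bytes) -> bool:
    """Placeholder for an on-device classifier; always returns False here."""
    return False


@dataclass
class ChildAccount:
    age: int
    parental_opt_in: bool  # a parent or guardian has enabled the feature


def handle_image(account: ChildAccount, image: bytes) -> List[str]:
    """Return the actions the device would take for an incoming or outgoing image."""
    actions: List[str] = []
    if not account.parental_opt_in:
        return actions  # nothing is scanned unless a guardian opts in
    if is_sexually_explicit(image):
        actions.append("blur_image_and_warn_child")
        if account.age < 13:  # assumed cutoff for the "sometimes notify parents" case
            actions.append("offer_parent_notification")
    return actions
```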

For reference, after the criticism, an Apple executive acknowledged that announcing the two measures together had created the misunderstanding that the Messages app is being censored, and reflected that it would have been better to make the difference between them clear. The EFF statement nonetheless criticizes both systems as a threat to privacy and security: the iCloud image scan infringes on the privacy of every iCloud Photos user by matching their images against a secret, government-created CSAM database, and the parental notification scan in Messages breaks the promise of end-to-end encryption.

The EFF also points out that Apple’s policy does not take into account children with abusive parents, and warns of the risk that authoritarian governments will push the system toward broader censorship. In response to the criticism so far, Apple counters that the iCloud photo scan does not apply to videos and cannot be repurposed by governments. However, once such a structure exists it is hard to rule out its being diverted into a backdoor, and Apple, which is after all just a company, has already shown that it has no choice but to comply with local laws.

Much of the criticism and confusion can be attributed to the fact that Apple unilaterally decided on measures affecting personal information on the iPhone, which has already become a global platform, without consulting external child abuse prevention organizations. Going forward, Apple may be forced to offer further explanations or make concessions. Related information can be found here.

Meanwhile, reports indicate that Apple has already been scanning iCloud Mail for CSAM content since 2019.

According to a previously published page (Our Commitment to Child Safety), Apple uses image matching technology to help find and report child exploitation; much like a spam filter, its systems flag suspected child exploitation material, and matches are reviewed individually. The page also stated that any account found with child exploitation content would be disabled for violating the terms of use.

Apple’s chief privacy officer also said at a tech conference in 2020 that Apple uses screening technology to look for illegal images, and that accounts are deactivated when child exploitation images are found, but did not go into detail about how they are discovered.

When contacted, Apple first said that it has never scanned iCloud Photos, but acknowledged that since 2019 it has scanned attachments to incoming and outgoing iCloud email for CSAM. Because iCloud email is not encrypted, Apple’s servers can scan it easily. Apple also said it has run limited tests on some other data, but did not specify what kind. Related information can be found here.

