Apple is building surveillance capabilities into its products. Here’s why you should care.


2 September 2021 at 8:11 am
Maggie Coggan
Over 90 global digital and human rights groups are calling on the tech giant to abandon the announced changes. We dive into what these changes are, and why so many groups are worried. 



Last month, Apple announced plans to build surveillance capabilities into iPhones, iPads and other Apple products.  

The company said these capabilities were intended to protect children and reduce the spread of child sexual abuse material (CSAM). They include scanning images sent via iMessage for sexually explicit content, and checking photos uploaded to iCloud against a database of known CSAM.

But the changes have a number of digital and human rights organisations across the globe worried. So much so that more than 90 of these organisations recently came together to publish an open letter calling on Apple to abandon the proposed changes.  

The letter said that these new surveillance capabilities would not counter the spread of CSAM, and would instead censor protected speech, threaten the privacy and security of people around the world, and have a disastrous impact on many children. 

So what exactly are these changes and why are digital and human rights groups worried? We take a look. 

What exactly are these surveillance capabilities? 

Every image has what is called a hash: a string of letters and numbers that uniquely identifies it, sometimes referred to as a “digital fingerprint”.
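To make the “fingerprint” idea concrete, here is a minimal sketch in Python that computes a conventional cryptographic hash (SHA-256) of some image data. This is not Apple’s NeuralHash, which is a perceptual hash designed to survive resizing and re-compression; the bytes below are only placeholders.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that acts as a simple 'digital fingerprint'."""
    return hashlib.sha256(image_bytes).hexdigest()

# Identical data always gives the same fingerprint; changing even a single
# byte gives a completely different string of letters and numbers.
photo = b"\xff\xd8\xff\xe0 ...imagine JPEG bytes here..."
print(fingerprint(photo))
print(fingerprint(photo + b"\x00"))
```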

Apple has now created an algorithm called NeuralHash, which can identify known CSAM by hashing iCloud photos and matching the hashes against a database of image hashes provided by child protection organisations such as the National Center for Missing & Exploited Children.
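NeuralHash itself is not a public library, but the matching step can be illustrated with a much simpler perceptual hash. The sketch below uses an “average hash” as a stand-in: it shrinks an image to a tiny greyscale thumbnail and records which pixels are brighter than average, so visually similar images produce similar hashes that can be checked against a set of known values. The hash values shown are placeholders, not real database entries, and real matching has to tolerate near-duplicates rather than requiring an exact string match.

```python
from PIL import Image  # requires the Pillow package

def average_hash(path: str, size: int = 8) -> str:
    """A simple perceptual hash (aHash), used here only as a stand-in for NeuralHash."""
    img = Image.open(path).convert("L").resize((size, size))  # tiny greyscale thumbnail
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)  # brighter than average?
    return f"{int(bits, 2):0{size * size // 4}x}"  # 64 bits as a 16-character hex string

# Placeholder database of hashes of known images.
known_hashes = {"ffd8c0a080000000", "0123456789abcdef"}

def matches_known_image(path: str) -> bool:
    """Flag an image if its hash appears in the database of known hashes."""
    return average_hash(path) in known_hashes
```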

Once an image is flagged, the photo is shown to an Apple employee to confirm the content is CSAM, before the account owner is reported to law enforcement. A notice is also sent to the organiser of a family account whenever a user under the age of 13 sends or receives a flagged image.
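Purely to pin down the sequence described above, here is a toy sketch of that flow. The thresholds, data structures and reporting channels in Apple’s actual system are not public in this form, and every name below is illustrative.

```python
from dataclasses import dataclass

@dataclass
class FlaggedImage:
    account_id: str
    user_age: int
    on_family_account: bool

def handle_flag(image: FlaggedImage, reviewer_confirms: bool) -> list[str]:
    """Return the follow-up actions the article describes for a flagged image."""
    actions = []
    if reviewer_confirms:  # an Apple employee confirms the content is CSAM
        actions.append(f"report account {image.account_id} to law enforcement")
    if image.on_family_account and image.user_age < 13:
        actions.append("send notice to the family account organiser")
    return actions

print(handle_flag(FlaggedImage("acct-42", user_age=12, on_family_account=True),
                  reviewer_confirms=True))
```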

For anyone who wants to know more about the ins and outs of how this works, Apple has published the details of the new security features in a 36-page technical summary.

Why is Apple trying to make these changes? 

Apple is one of the tech service providers that offers end-to-end encrypted messaging, which means the contents of messages and their data are completely private and can’t be accessed by law enforcement agencies, intelligence agencies, or even Apple itself. 

But there has been mounting pressure from governments and law enforcement agencies to provide a way for the contents of messages to be accessed when a crime is being committed. 

The other motive is that tech providers themselves, such as Apple, Samsung and Google, are concerned about the amount of CSAM being shared on their platforms.  

As Lucie Krahulcova, the executive director of Digital Rights Watch, explains, it’s something that tech companies haven’t been able to deal with. 

“Unfortunately over the past few years, there just hasn’t been a good mechanism to deal with the problem on a large scale,” Krahulcova told Pro Bono News.  

“It’s something that no one wants to see online on their [platforms].” 

She also said it is an issue the wider population is becoming increasingly concerned about, as there is a somewhat misguided belief that the problem is getting worse.

“Some would argue that there has been an increase of that material, which I’m not sure is the case. I think we’re just seeing a lot more of it because the reporting and monitoring of those images is being done better,” she said. 

Why are digital and human rights groups worried? 

Digital and human rights organisations argue that algorithms designed to detect sexually explicit material are notoriously unreliable.

Apple has previously said that the chance of the system incorrectly flagging a given account is about one in one trillion per year.  

But Krahulcova said that because the algorithm is looking for a string of numbers and letters, and not a specific image, there’s a pretty good chance it will get it wrong and could flag content like art, health information, educational resources and advocacy messages. 

Signatories of the letter also point out that the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. 

This isn’t always the case. The adult in charge of the account may be abusive, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. 

Lastly, there is the fear that once this backdoor feature is built in, governments could convince Apple to extend surveillance to other areas and to detect images that are objectionable for reasons other than being sexually explicit.

The letter said those images may be of human rights abuses or political protests, images companies have tagged as “terrorist” or violent extremist content, or even unflattering images of the very politicians pressuring tech companies to scan for them. 

So, is there an alternative? 

Krahulcova believes the solution lies in improving targeted surveillance, directed towards specific persons of interest, rather than trying to deploy mass surveillance.

She said there are ways to monitor suspicious online activity through software that mirrors a person’s screen, or that tracks what they are typing. And while these methods are invasive, this approach would limit who is targeted and mean innocent people are not caught up inadvertently. 

“I would go with the more targeted tools, even though they’re a bit more sophisticated, because the mass surveillance measures that Apple is trying to introduce are not the solution,” she said. 


Maggie Coggan  |  Journalist  |  @MaggieCoggan

Maggie Coggan is a journalist at Pro Bono News covering the social sector.

