How Apple Is Contributing to Digital Pollution By Eroding Privacy

You know that feeling in the morning when you realize you need to wake up because your dream is getting truly absurd? It felt like an odd dream to read last week that WhatsApp, the company that sold out to Facebook, was rightfully criticizing Apple for weakening privacy on its devices.

WhatsApp has never had a great track record on privacy. While message content is encrypted end-to-end, WhatsApp tracks metadata such as who is contacting whom, and when. If you’re a whistleblower who called a journalist, the metadata alone is enough to compromise you, regardless of what you said. Metadata is hugely valuable, and it is an important reason Facebook paid $19 billion to acquire the company.
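
To see why metadata alone is compromising, here is a hypothetical sketch in Python, with invented field names, of the kind of record such a service could retain without ever seeing the message content:

```python
from dataclasses import dataclass

# Hypothetical call-metadata record; field names are invented for
# illustration and don't reflect any real company's schema.
@dataclass
class CallRecord:
    caller: str       # e.g. a number known to belong to a whistleblower
    callee: str       # e.g. a journalist's publicly listed number
    started_at: str   # "2021-08-05T23:14:00Z": an odd hour
    duration_s: int   # 1800: a long conversation, not a misdial

# The encrypted content is never needed; the pattern of who talked to
# whom, when, and for how long tells the story on its own.
```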

In contrast, Apple had taken a strong stance on privacy in the past. It even opposed a court order to hack a user’s phone in 2016, saying, “The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers.” Apple’s new approach to privacy, unfortunately, marks a major reversal of that stance: essentially, the company is caving in to government pressure to open a backdoor to the iPhone.

Apple frames this as a feature to detect child abuse, explaining that in the versions of iOS and iPadOS to be released later this year, “the system performs on-device matching using a database of known child sexual abuse material (CSAM) image hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.” When enough images on a phone match the database, the matches are verified by a human reviewer and the authorities are notified.
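
To make the mechanics concrete, here is a minimal sketch of threshold-based hash matching in Python. It is hypothetical and heavily simplified: Apple’s system uses a perceptual hash (NeuralHash) and cryptographic vouchers rather than plain file hashes, and the threshold below is an assumption.

```python
import hashlib
from pathlib import Path

# Simplified, hypothetical sketch of threshold-based matching; a plain
# SHA-256 file hash stands in for Apple's perceptual NeuralHash.

KNOWN_HASHES: set[str] = set()  # hash database supplied by a third party
MATCH_THRESHOLD = 30            # assumed match count before flagging

def image_hash(path: Path) -> str:
    """Exact-match stand-in for a perceptual image hash."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def device_is_flagged(photo_dir: Path) -> bool:
    """Count photos whose hash appears in the database."""
    matches = sum(1 for p in photo_dir.glob("*.jpg")
                  if image_hash(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD  # True -> human review, then a report
```

Note that the matching logic is content-agnostic: nothing in the code knows what KNOWN_HASHES represents, so swapping in a different database repurposes the same scanner. That is exactly the concern below.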

Surveillance under a false premise

It’s hard to argue with the goal of protecting children. But while the technology may match images provided by NCMEC today, the way Apple has built the system, it could match images provided by any government or authoritarian regime tomorrow. This feature gives governments the backdoor they’ve always wanted: a system ready to be tweaked to flag or censor whatever content they demand.

In December 2019, Apple and Facebook were grilled at a U.S. Senate Judiciary Committee hearing on their encryption practices. Senators cited child abuse and shootings to demand that the companies make encrypted user data accessible to law enforcement. Senator Lindsey Graham threatened, “You’re going to find a way to do this or we’re going to go do it for you.”

We’d be naive to think that protecting people is the real motivation behind such a threat. Senator Lindsey Graham took a clear position on free speech back in 2011: “Free speech is a great idea, but we’re in a war.”

In 2013, when Edward Snowden revealed NSA surveillance, Graham said on Fox & Friends that he was glad for the surveillance:

“I’m a Verizon customer. I don’t mind Verizon turning over records to the government if the government is going to make sure that they try to match up a known terrorist phone with somebody in the United States. I don’t think you’re talking to the terrorists. I know you’re not. I know I’m not. So we don’t have anything to worry about.”
Senator Lindsey Graham


The problem with the argument of having nothing to hide

Apple uses this same argument when defending its new feature that erodes privacy:

“If you’re storing a collection of CSAM material, yes, this is bad for you. But for the rest of you, this is no different.”
Erik Neuenschwander, Chief Privacy Engineer, Apple

The first problem with this argument is that it implies you shouldn’t be advocating for privacy unless, of course, you have something to hide. Without privacy you have mass surveillance, which precludes free speech. Free speech (and therefore privacy) is necessary to hold authorities accountable and to remain a democracy. By making you feel mildly uncomfortable for saying that you value privacy, this argument works to ensure that you don’t stand in solidarity with the journalists and human rights activists who challenge the status quo. We need to say in a unified voice that we all need privacy and that mass surveillance is not acceptable.

The second problem with arguing that this feature is only bad for you if you’re storing a collection of CSAM is that it requires you to trust Big Tech that your content will only ever be scanned for CSAM. It’s a big leap of faith to assume that Apple, having built the capability to scan for anything, would never let governments abuse that access to unencrypted data to scan for other content and quash dissent.

A New York Times investigation has revealed how Apple’s actions in China already demonstrate that, when forced to choose between profits and protecting customers, the company chooses profits. Giving in to pressure from the Chinese government, Apple compromised its Chinese users’ data by storing it on servers run by a state-owned Chinese firm, with the digital keys to unlock that data stored in the same data centers.

Avoiding Digital Pollution by Protecting Privacy

Collecting any user data comes with responsibility. Our products often collect data about users’ shopping preferences, habits, location, and so on. When in doubt, we err on the side of collecting more, just in case it reveals additional insights.

Unfortunately, when we store data, we cannot guarantee that we will be able to protect it. Even with the best of intentions, our products can erode privacy by collecting a wealth of data that is ripe for analysis and use in mass surveillance. As a result, our products often create collateral damage to society by eroding democracy.

Just as industrial growth has created collateral damage to the environment in the form of industrial pollution, carefree growth in the digital era has led to digital pollution that frays the fabric of our society. Our carefree approach to collecting user data is an example of digital pollution.

What was absurd about WhatsApp’s criticism of Apple is that it reminded me of one oil giant criticizing another for environmental pollution. Both WhatsApp and Apple are contributing to digital pollution: the first by storing metadata that Facebook can exploit, the second by building a backdoor into encryption.

To create a cleaner digital footprint, companies need to protect privacy (and democracy) by reducing the metadata they collect and ensuring end-to-end encryption.
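
As a hypothetical sketch of what “reducing metadata” can look like in practice (the schema below is invented for illustration), compare what a service might log with what delivering an end-to-end-encrypted message actually requires:

```python
import time
from dataclasses import dataclass

# Hypothetical illustration of metadata minimization; not any real
# company's schema. The point: store only what delivery requires.

@dataclass
class GreedyRecord:        # "just in case" logging
    sender: str
    recipient: str
    timestamp: float
    device_id: str
    ip_address: str

@dataclass
class MinimalRecord:       # what relaying an encrypted message needs
    recipient: str         # to route the encrypted payload
    expires_at: float      # to garbage-collect undelivered messages

def minimize(recipient: str) -> MinimalRecord:
    # Deliberately never records sender, device, or IP address.
    return MinimalRecord(recipient=recipient, expires_at=time.time() + 86400)
```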

Finally, as consumers, each of us has a role to play in reducing digital pollution. You can make choices that protect your privacy as a way of being a responsible citizen of a democratic society. One way to exert that choice is to quit WhatsApp and Apple’s iMessage and switch to Signal, an open-source messenger with end-to-end encryption that doesn’t store metadata.
