We need to talk about how Apple is normalizing surveillance
Updated: Oct 13
Apple has taken a public stand on privacy, curtailing data abuses by apps and declaring it doesn’t exploit its users’ information. But it has also created comprehensive new ways to track us
Among tech giants, Apple stands out for its insistence on offering privacy features instead of joining its competitors in the exploitation of users’ personal data. Apple CEO Tim Cook recently clashed with Facebook on exactly those grounds – making a point, during a podcast interview, of defending privacy over the social network’s data-hungry ad-based model. When it comes to privacy, the maker of the iPhone is presenting itself as a good actor. But is that actually the case?
Apple recently rolled out a smorgasbord of new privacy features: Apple devices now block tracking pixels embedded in emails, tell users how many times an app has accessed sensitive data, route web traffic through a relay to mask it, and allow for the creation of unique email aliases. All of these are praiseworthy tools – long overdue – to protect our privacy.
What’s more, judging by its success as the world’s biggest company by market capitalization, Apple is demonstrating that privacy can sell. About 94 per cent of American users opted out of data collection when Apple gave them the choice. Given its reach and clout, Apple has been able to do more for privacy in one software update than most governments around the world have done in years (although it seems that some apps are still tracking non-consenting users). With a single decision, Apple can improve privacy standards globally.

But, before we give it too much credit, we should take a critical look at the company’s general direction of travel. Many of Apple’s latest features are about enhancing surveillance – even if Apple would never call them that. The new iPhone operating system, iOS 15, can digitize text in photos, enabling users to copy and paste text from an image, or call a phone number that appears in a picture. Scanning nearby buildings with an iPhone will make Maps recognize them and generate walking directions. Algorithms will identify objects in real-time video, and it will be possible to turn photos into 3D models for augmented reality. And users will now be able to carry their IDs on their phones. All of these features increase the amount of data collected.
Apple is also active in the lucrative business of healthcare. Using their iPhones and Apple Watches, people can track their steps, heart rate, and gait, among other things. A new sharing tab on the Health app even lets users share their health data with family and caregivers. Granted, all that data is supposed to be kept secure – but whenever sensitive information is collected and shared that easily, data disasters lurk just around the corner.

Indeed, once one starts scratching the surface, Apple’s contribution to the development of invasive technologies and the normalization of surveillance becomes evident. Apple created the Bluetooth beacons that track people in shops, gyms, hotels, airports and more by connecting to their phones. Apple’s use of Face ID to unlock the iPhone has helped normalize facial recognition. Its AirTag – a small device that can be attached to personal items in order to track them – has raised concerns among privacy advocates that it will make it easier to track people. The Apple Watch, the most advanced wearable on the market, leads us one step closer to under-the-skin surveillance, which can read our bodies and emotions.

Most recently, Apple developed a tool that can scan photos on people’s devices in search of child abuse material. While the objective is noble, the tool could be used for less ethical purposes and, according to security expert Bruce Schneier, it effectively breaks end-to-end encryption – the most powerful way we currently have to protect the privacy of our devices. (Apple later decided to pause its plans to roll out the tool.)

When it comes to privacy, iOS arguably has a better reputation among consumers than Android, as do Siri versus Alexa, and Safari versus Chrome. But that doesn’t give Apple permission to track our lived experience at all times with its microphones, cameras and sensors.
Apple’s groundbreaking devices are pushing the limits of what technology companies can track, and that is not good news for privacy. Thanks to Apple, physical shops can track us through our phones, hackers can potentially access our most sensitive health and biometric details, and the company has now developed a technology that can scan content that was supposed to be encrypted. Apple has been playing two games at once – protecting privacy and developing surveillance tools – while only acknowledging the former.

All tech giants share a desire to digitize the world. What is left unsaid by Apple and others is that digitizing the world entails surveilling it: recording everything, making it taggable, trackable, searchable – and hackable. Of course, asking tech companies not to digitize the world is like asking builders not to pave over natural areas. Unless society sets limits, that is not going to happen. That’s why governments create protected areas when it comes to building.
We need similar protected areas when it comes to surveillance. Conquering more of the offline world is an easy way for tech giants to keep growing. It’s in their nature to turn the offline into online, the analogue into digital. But turning everything into a potential spy is a threat to freedom and democracy. Surveillance leads to societies of control, which in turn lead to dwindling freedom. When we know we are being watched, we self-censor, and when others know too much about us, they can predict, influence and manipulate our behavior.
If Apple is serious about privacy, it should offer an iPhone model for the privacy-conscious: one without facial recognition or encryption-breaking tools, one in which it is easy to cover the camera, and one in which the microphone can be mechanically switched off, among other features. If we want to keep our democracies in the digital age, we need to set limits on what gets tracked.
Carissa Véliz is an Associate Professor at the Faculty of Philosophy and the Institute for Ethics in AI
Updated 11.10.19, 15:00 GMT: The headline of this article has been updated