
UCL News


Opinion: Privacy is not the problem with the Apple-Google contact-tracing app

1 July 2020

Dr Michael Veale (UCL Laws) explains the 'decentralised' approach of the Apple-Google contact-tracing system, the problems with the NHSX app, and why the infrastructural power given to big tech companies by contact-tracing tools should give us sleepless nights.

Dr Michael Veale

In April, Apple and Google announced a partnership. They would take research into how to undertake Bluetooth-powered Covid-19 contact tracing in a privacy-preserving manner, with no central database, and make it available as a toolkit inside their operating systems for public health authority-sanctioned apps. Before they did this, all such apps had effectively been doomed to fail. At least on iPhones, they were crippled by the same baked-in Bluetooth restrictions that stop normal apps secretly tracking you.

The firms' contact-tracing toolkit has been both praised and condemned. Its "decentralised" approach, with no sensitive central database of who-saw-who, has been supported by hundreds of privacy, security and human rights scholars. The concerns are understandable. The history of passports – which were introduced as a seemingly temporary measure during the first world war, but were retained in response to fears about spreading the Spanish flu – shows that pandemics can significantly influence our social infrastructure. And so such systems should be designed to minimise future misuse.

Through a software update, Apple and Google loosened privacy restrictions enough to allow public health authorities to run decentralised contact-tracing apps, but did not engineer new functionality to let apps send the unique Bluetooth identities of phones they encountered to a central server. Data had to stay on the phone: not a problem for decentralised systems, but centralised apps – such as those favoured by France and by NHSX, the tech wing of NHS England – continued to struggle to use Bluetooth.
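
To make the "decentralised" idea concrete, here is a toy sketch of how on-device matching can work. It is illustrative only, not the actual DP-3T or Apple-Google protocol (which derives ephemeral IDs with HKDF and AES rather than a bare hash chain, and adds many safeguards); the names, key sizes and ID counts here are assumptions.

```python
import os
import hashlib

def daily_ephemeral_ids(daily_key: bytes, n: int = 96) -> list:
    """Derive n rotating ephemeral Bluetooth IDs from a secret daily key.
    Illustrative hash derivation; real protocols use HKDF + a cipher."""
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(n)]

# Each phone broadcasts its ephemeral IDs and records the IDs it hears.
# No central server ever learns who met whom.
alice_key = os.urandom(32)
alice_ids = daily_ephemeral_ids(alice_key)

# Bob's phone overheard some of Alice's broadcasts (plus a stranger's).
bob_heard = {alice_ids[3], alice_ids[40], os.urandom(16)}

# If Alice tests positive, she uploads only her daily key. Bob's phone
# re-derives her ephemeral IDs locally and checks for overlap on-device.
published_ids = set(daily_ephemeral_ids(alice_key))
exposed = bool(published_ids & bob_heard)
print(exposed)  # True: exposure computed on Bob's phone, not a server
```

The point the article is making survives even in this sketch: the only thing ever centralised is the keys of people who tested positive, and the sensitive "who met whom" computation happens entirely on each handset.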

Reasons for preferring centralised systems differed. NHSX wanted individuals to trigger self-isolation alerts based on self-reported symptoms, and said it needed centralised fraud analysis to weed out the inevitable hypochondriacs and trolls. The French minister for the digital sector, Cédric O, said that self-reporting was a no-no, and instead wanted to use centralisation to try to lower the risk of a particular snooping attack from a tech-savvy neighbour. (This is a risk that can never be fully removed from any Bluetooth contact-tracing system.)

Tensions grew as it became clear that the firms did not intend to engineer a further global change to their operating systems to specifically accommodate these countries. In the French parliament, O stated that it was no coincidence that the UK and France were going against the grain, given that they were "the only two European states with their own nuclear deterrent". However, it is worth noting that no country, nuclear-armed or not, even attempted to use the first tool of a sovereign nation against the firms – the ability to make binding laws. Instead, they continued the bizarre path, seen in recent years from politicians around the world, of treating these firms like sovereign nations, hoping that they recognised each other's legitimacy and that their "officials" could come to some agreement.

They did not, and the saga of the demise of NHSX's centralised app in a mid-June U-turn is well documented. NHSX piloted an app relying on fragile workarounds to avoid the privacy restrictions built into operating systems, despite warnings from many outside the project – myself included – that it was likely to encounter problems. In June, the government admitted that those workarounds left its system unacceptably poor at detecting nearby phones, iPhones in particular.

What can we learn from NHSX's encounter with these tech giants? One key lesson requires distinguishing the problem of privacy from that of platform power. It is possible to be strongly in favour of a decentralised approach, as I am (as a co-developer of the open-source DP-3T system that Apple and Google adapted), while being seriously concerned about the centralised control of computing infrastructure these firms have amassed.

It's commonly said that in the digital world, data is power. This simple view might apply to a company collecting data through an app or a website, such as a supermarket, but it doesn't faithfully capture the source of power of the firms controlling the hardware and software platforms those apps and websites run on. Using privacy technologies, such as "federated" or "edge" computing, Apple and Google can understand and intervene in the world while truthfully saying they never saw anybody's personal data.

Data is just a means to an end, and new cryptographic tools are emerging that let those firms reach the same potentially problematic ends without privacy-invasive means. These tools give those controlling and co-ordinating millions or even billions of computers the monopolistic power to analyse or shape communities or countries, or even to change individual behaviour – for example, to target ads at people based on their most sensitive data without any individual's data ever leaving their phone. Nor is it just ad targeting: privacy technologies could spotlight the roads where a protest is planned, the areas or industries likely to harbour undocumented migrants, or the spots in an oppressive country most likely to host illegal LGBT clubs – not personal data, but data with serious consequences nonetheless.
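
One of the cryptographic tools this paragraph gestures at can be sketched in a few lines. Below is a toy version of secure aggregation via pairwise additive masking, in the spirit of federated analytics: the coordinator learns an exact population-level statistic while every individual report it receives looks like random noise. This is an assumption-laden illustration, not any firm's real system; a real protocol would establish the masks with cryptographic key agreement rather than plain random numbers.

```python
import random

# Each phone holds a private local value (say, 1 if it was near a
# hypothetical location of interest, 0 otherwise).
values = {"phone_a": 1, "phone_b": 0, "phone_c": 1}

# Pairwise masks: each pair of phones agrees on a random mask; one adds
# it and the other subtracts it, so masks cancel in the total while
# hiding each individual report from the server.
phones = list(values)
masked = dict(values)
for i in range(len(phones)):
    for j in range(i + 1, len(phones)):
        m = random.randint(-10**6, 10**6)
        masked[phones[i]] += m
        masked[phones[j]] -= m

# The server sees only the masked reports, yet recovers the exact total.
total = sum(masked.values())
print(total)  # 2: a population-level insight, no individual value revealed
```

This is precisely the shape of power the article describes: the aggregate insight is real and actionable, yet no "personal data" in the conventional sense ever changes hands.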

This approach is effectively what underpins the Apple-Google contact-tracing system. It's great for individual privacy, but the kind of infrastructural power it enables should give us sleepless nights. Countries that expect to deal a mortal wound to tech giants by stopping them building data mountains are bulls charging at a red rag. In all the global crises, pandemics and social upheavals that may yet come, those in control of the computers, not those with the largest datasets, have the best visibility and the best – and perhaps the scariest – ability to change the world.

Law should be puncturing and distributing this power, and giving it to individuals, communities and, with appropriate and improved human-rights protections, to governments. To do so, we need new digital rights. Data protection and privacy laws are easily dodged or circumvented by technical assurances of confidentiality: we need something more ambitious to escape the giants' walled gardens.

A "right to repair" would stop planned obsolescence in phones, or firms buying up competitors just to cut them off from the cloud they need to run. A "right to interoperate" would force systems from different providers, including online platforms, to talk to each other in real time, allowing people to leave a social network without leaving their friends. These interventions need strong accompanying oversight to maintain security and privacy, and to prevent unwanted side-effects or government abuse, such as the outlawing of end-to-end encryption to oppress dissidents and whistle-blowers. It all starts from realising that deflating digital power isn't just about governing data: it's the walls of the underlying systems we have to tear down.

This article was first published on 1 July 2020.
