Tracing app is no cure-all

Submitted by AWL on 5 May 2020, 7:47. Author: Ben Tausz

This article is abridged in the print and PDF versions.

As governments and tech companies tout digital contact tracing as a way out of coronavirus lockdown, many experts and privacy campaigners are expressing scepticism. They worry these systems may do little to help contain outbreaks but could usher in unprecedented mass surveillance. So, what are the issues?

First, what is digital contact tracing? Manual contact tracing is an age-old public health measure. A patient newly diagnosed with an infectious disease sits down with an expert interviewer, who helps them retrace their steps for the time they may have been infectious and identify encounters in which other people may have been infected. Health workers then contact those people and isolate them to prevent further spreading.

This is fine for a disease which requires close personal contact to transmit, so that the patient is likely to know most of the people they may have infected. But it hits limits in an epidemic that spreads via aerosol droplets. Could you name everyone you sat next to on public transport across two weeks? Can the NHS deploy enough interviewers, quickly enough, to contain a raging pandemic?

Enter digital contact tracing. The forms proposed in most European countries are based around building smartphone apps that use Bluetooth and anonymous IDs to ping other nearby smartphones carrying the same app, to build a log of close encounters. When a user reports they’re infected, the log shows recent encounters that could have posed an infection risk. Anyone who might have been infected can be alerted via their app, and given advice (“self-isolate”, “order a test”, “contact your doctor” etc) before they unwittingly spread it further.
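
As a very rough illustration of the idea – not any real app's code, and with every name and figure below invented for the example – the kind of record such an app keeps might look something like this:

```python
# Illustrative sketch only, not any real app's code: real systems sit on top
# of Bluetooth radio APIs and cryptographic key schedules not shown here,
# and every name and number below is invented for the example.
import os
import time

ROTATION_SECONDS = 15 * 60  # assumed: broadcast a fresh anonymous ID every 15 minutes


def new_anonymous_id() -> bytes:
    """A random, meaningless identifier to broadcast over Bluetooth."""
    return os.urandom(16)


class EncounterLog:
    """Stores only what tracing needs: which IDs this phone has broadcast,
    which IDs it saw nearby, when, for how long and roughly how close.
    No names, phone numbers or locations."""

    def __init__(self):
        self.my_broadcast_ids = [new_anonymous_id()]
        self.encounters = []  # kept on this phone

    def rotate_id(self):
        """Called periodically so a phone can't be followed around by one fixed ID."""
        self.my_broadcast_ids.append(new_anonymous_id())

    def record_encounter(self, seen_id: bytes, duration_s: int, approx_distance_m: float):
        """Log a nearby phone's current ID, plus how long and how close it was."""
        self.encounters.append({
            "id": seen_id,
            "timestamp": time.time(),
            "duration_s": duration_s,
            "approx_distance_m": approx_distance_m,
        })
```

Whether that log then stays on the phone or is shipped off to a central server is exactly the design choice discussed below.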

The technology is new, and it is difficult to separately assess the effectiveness of individual components of a public health response – meaning that there is little evidence about how much good it does on top of existing methods.

Digital contact tracing must complement, not replace, investment in other measures

What we can say with some certainty is that digital contact tracing is not a silver bullet. At best it might complement other measures. The product lead for Singapore’s app says their experience suggests “contact tracing should remain a human-fronted process”.

To be effective, a large percentage – perhaps a majority – of the population needs to take up the app. In Singapore only 20% have.

According to Ofcom, 22% of UK adults don’t have a smartphone, so they couldn’t participate. The proportion is even higher among poorer people and older people (groups who are already more vulnerable if infected) and younger children (a concern if nurseries and primary schools, where social distancing is hard to maintain, turn out to be key sites of transmission).

So not only are large numbers of people left out, they aren’t spread evenly through the population but are clustered, more vulnerable, and more likely to interact with each other – compounding the problem.

Since not everyone could use an app, it can’t be compulsory. Consider too the police violence and harassment that would inevitably result from any attempted enforcement – and the virus transmission risked by excessive police contact.

Non-coercive, non-exploitative strategies to encourage as much uptake as possible, as evenly as possible, are therefore needed. And these strategies must consider how to protect people from being scammed into downloading any malicious software that falsely claims to be the official app.

Lack of tests weakens contact tracing

With the government’s planned testing levels, the app would largely rely on individuals self-assessing their symptoms to report infection.

This is inevitably less accurate, and it is also open to malicious reporting. Trolls could use it for mass disruption, or an abuser might report themselves sick to get their victim quarantined.

And most encounters with a genuinely infected person don’t result in infection. If alerted users don’t have quick access to accurate tests, they will have to self-isolate even if it turns out to be unnecessary.

If users receiving alerts perceive them as unreliable or oversensitive, they may be less likely to follow the instructions. So, contact tracing’s effectiveness might depend on NHS testing capacity.

Privacy by design

Especially given the rise of hard-right authoritarianism, we can’t trust the state to regulate its own use of intrusive technology. And anything stored or processed centrally is vulnerable to hacking, especially if development is rushed.

So, privacy must be built into the system’s bones. As little data as possible must pass through the hands of the authorities or businesses.

First, that means only collecting necessary data: the “encounters” each device has had, their duration and proximity. No personal or location data.

Second, as little as possible should be stored centrally. Campaigners, backed even by Apple and Google, favour a decentralised system. The log of encounters stays on the user’s device unless the user tells the app they are infected. Then it uploads its log to a central list.

Every other user’s device regularly checks that list to see whether any of its own (anonymised and constantly rotating) IDs appears. The app on the device estimates the risk that its owner has been infected – based on the number, proximity and duration of encounters with infected users – and alerts its owner if they need to act.
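
As a sketch of how that on-device check might work – again with invented names and thresholds, reusing the hypothetical encounter format from the earlier example, and leaving out the cryptography that real protocols such as Apple and Google’s layer on top – the matching could look something like this:

```python
# Simplified sketch of the on-device check, mirroring this article's
# description rather than any real protocol. Names and thresholds are
# invented; the encounter format matches the earlier hypothetical example.


def estimate_risk(my_broadcast_ids, reported_encounters) -> float:
    """Runs entirely on the user's own phone.

    my_broadcast_ids: the rotating anonymous IDs this phone has broadcast.
    reported_encounters: the central list of encounters uploaded by users
    who reported themselves infected (just IDs, durations and rough distances).
    """
    mine = set(my_broadcast_ids)
    risk = 0.0
    for enc in reported_encounters:
        if enc["id"] in mine:
            # Hypothetical weighting: longer and closer encounters score higher.
            closeness = max(0.0, 2.0 - enc["approx_distance_m"]) / 2.0
            risk += (enc["duration_s"] / 60.0) * closeness
    return risk


def check_and_alert(my_broadcast_ids, reported_encounters, threshold=15.0):
    """The phone alone decides whether to alert; nothing about the result of
    the match is sent back to the central server."""
    if estimate_risk(my_broadcast_ids, reported_encounters) >= threshold:
        print("Possible exposure: self-isolate and consider ordering a test.")
```

The crucial point is that both the comparison and the decision to alert happen on the user’s own phone; the central list never learns who matched.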

But some governments, including the UK, want a centralised system. All our encounters would be on a central database. They talk about anonymised IDs, but with that much information it wouldn’t be hard for the state – or a hacker! – to identify individuals. The German government wanted centralisation but recently relented under pressure – we must push our government similarly.

Third, the system must constantly delete its records. It should keep only recent encounters and reports, within the virus incubation period.
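
A sketch of that retention rule, with 14 days picked purely as an assumed stand-in for the incubation window:

```python
# Illustrative retention rule; 14 days is purely an assumed stand-in for the
# incubation window, and the record format matches the earlier sketches.
import time

RETENTION_SECONDS = 14 * 24 * 60 * 60


def prune_old_records(encounters, now=None):
    """Drop anything older than the retention window, so neither the phone
    nor any central list builds up a long-term history."""
    now = time.time() if now is None else now
    return [e for e in encounters if now - e["timestamp"] <= RETENTION_SECONDS]
```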

Fourth, the system must be separate from any lockdown or quarantine enforcement. Users could be given the option of sharing information with the NHS or their GP when they get an alert, but this should require additional consent.

We must guard against anything like China’s system, where an app’s opaque risk-scoring of its user acts as a passport to certain rights, and the state might easily manipulate scores to lock down dissidents and persecuted minorities.

This is also vital to maximise uptake. If people feel the app is too intrusive, they won’t install it or obey its guidance, rendering it useless. Indeed, many campaigners and experts would rightly warn against participation.

People with more reason to fear surveillance, or with experience of ill-treatment by the state, might be less likely to use an intrusive app. For example, migrants with uncertain or changing status, or ethnic minorities who already experience more state harassment and violence. Precisely the people on the sharp end of inequalities that already make them more vulnerable if infected.

If trust in health authorities fell and suspicion grew, the knock-on effects might even reduce compliance with other public health measures – letting the virus surge back.

Full isolation pay and benefits

Finally, an app telling people to self-isolate is useless if those people must choose between following the advice or putting food on the table. Those advised to self-isolate need full pay or benefits to get them through.

Scrutiny and caution

Given the problems, we need caution about introducing digital contact tracing. But the tendency to present novel technologies as magic quick fixes could foster a false sense of security. Governments might rush through an app, then use the perceived protection to push us back into workplaces too early or without other safeguards.

We must demand that the government only introduces digital contact tracing if it can transparently present a full plan, and if public and expert scrutiny confirms that plan adequately addresses the limitations. And if it goes ahead, it must protect privacy and cannot come at the expense of investment in other measures.

If a system that doesn’t meet these criteria is introduced, it could be useless or even damaging in the fight against the virus. And it could bring state surveillance to insidious and lasting new levels. The stakes are high.
