Pandemics and privacy
COVID-19, contact tracing apps and protecting our privacy
Over the next few months, the world is going to come out of lockdown. The Office for Budget Responsibility’s coronavirus analysis suggests that there will be a profound shrinking of the economy during the period of lockdown – GDP falling by 35% – but that it will bounce back quickly. However, that relies on the lockdown being relatively short, no more than three months. A longer lockdown would start to have profound and long-lasting impacts on the economy.
It will therefore be necessary to find other measures to allow the economy to restart. A vaccine would be ideal, but one is likely to be many months or even years away, as even the most promising candidates have not yet reached Phase I testing.
The likely outcome therefore will be continued, but less stringent, social distancing guidelines, together with increased testing – both for active virus, and for antibodies that show someone has had the disease in the past. Perhaps key to the success of that will be “contact tracing” – identifying who an infected person has been in contact with over the period that they were likely to have been infectious, to minimise the likelihood that they will spread the disease further.
Contact tracing is a central plank of infectious disease epidemiology. In the early years of the HIV outbreak, investigators were able to learn a great deal about the then mysterious disease because Gaetan Dugas, an airline steward who died of AIDS in 1984, provided them with extensive information about his recent sexual partners.
But with respiratory illnesses such as COVID-19, contact tracing is more difficult – it is relatively easy to identify all the people with whom someone has had sexual contact, but it is much harder to identify all the people who have been in proximity with them generally.
One hope is that contact tracing apps – applications which note when smartphone users are close to one another, and can inform people if they have been in close contact with an infected or at-risk person – can provide a way of doing this sort of tracking in close to real time, and allow lockdown to be eased while keeping the outbreak under control. Modelling suggests that they can be effective at reducing the spread of the disease, although they will need to be widely adopted by the public.
However, contact tracing apps will need to take and keep some data from their users. This has significant privacy implications, especially as some users will necessarily be ill and at risk. This piece will discuss ways of minimising those privacy implications.
Dr Yves-Alexandre de Montjoye, of Imperial College London’s Data Science Institute, says that it is important to realise that improving privacy does not mean losing effectiveness, and vice versa. “I understand people who say that, in a public health crisis, we should curb privacy laws,” he says. “But I’ve been working with location and Bluetooth data since 2008, and from a technical perspective, we don’t have to.”
Location data or Bluetooth data?
Contact tracing apps work in one of two main ways. The first uses location data: the app tracks where users have been over time. If someone is found to be infected, then everyone whose phone shows they were physically close to that person will be alerted and advised to take whatever precautions are recommended.
The second is using Bluetooth. Instead of relying on where the phone is, a Bluetooth app will track which other phones it comes near. When another phone with Bluetooth switched on and the contact-tracing app enabled comes nearby, the two will connect, and a record will be kept of which phones have been connected. If a person is found to be infected, then everyone whose phone has connected with theirs will be alerted and told to take precautions.
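The record-keeping this describes can be sketched as a toy data structure. Everything below – the device IDs, the 14-day window, the class and method names – is illustrative, not any real app’s protocol:

```python
from datetime import datetime, timedelta

class ContactLog:
    """Toy encounter log for the Bluetooth approach described above.
    The retention window and identifiers are illustrative choices."""

    def __init__(self, retention_days: int = 14):
        self.retention = timedelta(days=retention_days)
        self.encounters: list[tuple[str, datetime]] = []

    def record(self, other_device_id: str, when: datetime) -> None:
        """Note a connection with another phone running the app."""
        self.encounters.append((other_device_id, when))

    def contacts_within_window(self, now: datetime) -> set[str]:
        """Devices seen within the retention window - the set that
        would be alerted if this user tested positive."""
        cutoff = now - self.retention
        return {dev for dev, t in self.encounters if t >= cutoff}
```

A centralised app would upload a log like this to a server; the decentralised designs discussed later in this piece avoid exactly that step.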
Dr de Montjoye says that both location data and Bluetooth data are sensitive, although – rightly or wrongly – Bluetooth is usually seen as less so. Certainly, he says, it is more precise. “Every five minutes it will tell me if I was a metre, half a metre, two metres away from your phone,” he says. “So from a tech perspective, most people focus on that.”
The key questions
All app developers say they take privacy seriously, Dr de Montjoye says. “Everyone realises, at a high level, that there are privacy questions. But it’s unclear what is being collected, what is being protected and how it’s being done.”
The key thing, he says, is that users should not be in a position where they have to trust anyone – whether it’s the app’s developers, or the authorities. “In the UK, we have the rule of law, and most people trust the NHS and the government,” says de Montjoye. “But the goal is to design something so that we don’t have to rely on this trust: so we know that even if someone malicious had access to the system, if there’s a data leak or someone hacks into the server, that they wouldn’t be able to do very much with it.”
“The nightmare scenario,” he says, “would have been an app that records your location and uploads everything to a central server, so your real-time location and every person you met is linked to your ID card or phone number.
"Thanks to early warnings from the privacy community, GDPR, and Google and Apple's rapid work, I'm hopeful that we have avoided this, at least in Europe."
It’s not enough, he says, to say that you don’t collect location data – if your Bluetooth connections reveal who you’ve been in contact with, that is sensitive.
In a thought experiment using real Bluetooth data collected from university students in Denmark, de Montjoye and colleagues looked at how a very small number of people recording their location alongside close-proximity data could put everyone at risk. The question was: what percentage of the students’ exact locations did the researchers need to know in order to infer the locations of the rest?
“The result was striking. Using a small percentage of students who also shared their location, we were able to find out the location of the rest of the students.
“If we extrapolate that result to a large, dense city like London: a rogue app that leaks the location of one per cent of the population can, together with the ‘proximity data’, map the location of half of Londoners.”
What you need to do, he says, is design the app so that – as far as possible – the privacy risks are technically contained.
In a new white paper, he has proposed eight questions which policymakers should ask of developers before any contact tracing apps are put forward for use. These questions, ideally, would be asked by MPs – perhaps in select committee hearings – and would allow them to get beyond those high-level, generic questions and get a more specific understanding of what app developers are doing to protect users’ data.
He stresses that this is not about not trusting the NHS, or the government, or thinking that they are going to misuse the data. It is about ensuring that the data cannot be accidentally leaked, or hacked. And, further, he says that the UK needs to take a leadership role: if the NHS and the UK government do not take pains to protect the privacy of their citizens, it will be harder to argue that other countries should do so.
Privacy by design
His colleague Esther Rodriguez-Villegas, professor of low-power electronics at Imperial’s Department of Electrical and Electronic Engineering, says that while right now people are willing to put their trust in the NHS and government, that may change as the pandemic moves out of its most dangerous phase.
“We’re at the peak of the crisis now,” she says. “But when the situation doesn’t feel so dangerous, will people still trust it? It’ll only work if people are using it at the valley of the epidemic; at that point we’ll have a population out of lockdown, and if they don’t trust it they won’t use it.”
One way to increase trust, she says, is to make sure it’s not a government-built app. “There is a risk that certain governments will use it for things they’re not saying,” she says.
Her own app is intended for use around the world, not just in the UK. “Our government I hope wouldn’t do that,” she says. “But Apple and Android phones are everywhere in the world, and Apple is very reluctant to support government-driven apps that don’t follow strict privacy design principles.”
In France, Apple is refusing to support a government-backed app for exactly that reason. “The government can facilitate a third-party app – if they have a code confirming a diagnosis from the government, that would be very helpful,” she says. But – in other countries at least, if not Britain – it may undermine public trust if the government is building the app itself.
She adds that privacy must be built in from the ground up. “The idea is that it has to be private by design.” She notes that one widely used app, the King’s College London symptom tracker, asked for personal information: an email address, some basic demographic details. “If you’re not going to use personal information, it’s better not asking for it,” she says.
Her team is creating a contact tracing app “that won’t request or record any personal info”, she says.
The questions raised by Dr de Montjoye are not simply academic. Several apps or systems are already in action in some countries, and a number of them appear problematic. Some are compulsory – required by national law to be on your phone – while others are voluntary.
For instance, Singapore uses TraceTogether, which uses Bluetooth’s “Received Signal Strength Indicator” (RSSI) readings to establish how close your phone is to other people’s. It stores that data for 21 days; if someone is found to be infected, the Singaporean Ministry of Health will use the data to contact anyone who also has the app and was in close contact in the last 14 days. It is voluntary.
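As an aside on how an RSSI reading translates into a proximity estimate: a common, if crude, approach is the log-distance path-loss model. The calibration constants below are illustrative – in practice they must be calibrated per phone model, and readings are very noisy:

```python
def estimate_distance_m(rssi_dbm: float,
                        rssi_at_1m_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Rough distance estimate from a Bluetooth RSSI reading using
    the log-distance path-loss model. rssi_at_1m_dbm is the expected
    signal strength at one metre; the exponent models how quickly the
    signal decays. Both constants here are illustrative defaults."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

With these defaults, a reading of -59 dBm maps to roughly one metre and -79 dBm to roughly ten – which is why apps typically bucket readings into coarse bands (“near”, “far”) rather than trusting exact distances.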
China’s HealthCode app is technically voluntary as well, but it is required if you wish to move freely around the country. It is not clear what data it collects; it simply gives users a rating, red, orange or green, depending on how at-risk they are deemed to be by its algorithm. Its lack of transparency is concerning.
Israel’s Hamagen uses location data to track movements; this data appears to be stored centrally.
And South Korea, which does not have an app, instead sends alerts to everyone in an area where a person is known to be infected, including the age and sex of the person and their location and recent movement. It risks letting people work out who might have infected them, with the attendant danger of harassment or revenge attacks.
Professor Rodriguez-Villegas’s forthcoming app will use Bluetooth data. But instead of storing that data centrally, each phone transmits a unique identifier code to every other phone it meets. The phone’s identifier changes regularly and randomly. When a person is found to be infected, they can choose to report that through the app, which uploads all its former codes to the central authority. Then every phone which recognises one of those old codes will know it has been in proximity to an infected person – but, because the phone has since changed its code, it will be impossible for either the app or other users to identify them. Her team is also designing the app so that it can be used by people who have never used a smartphone.
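The scheme described above can be sketched in a few lines. The class and method names are hypothetical, not the team’s actual design; the point is that the central authority only ever sees the diagnosed user’s own random codes, never whom they met:

```python
from secrets import token_hex

class Phone:
    """Sketch of a decentralised rotating-code protocol: each phone
    broadcasts a frequently changing random code, remembers the codes
    it hears, and on diagnosis publishes only its own past codes."""

    def __init__(self):
        self.current_code = token_hex(16)   # random 128-bit code
        self.my_past_codes: set[str] = set()
        self.heard_codes: set[str] = set()

    def rotate(self) -> None:
        """Change the broadcast code regularly and at random."""
        self.my_past_codes.add(self.current_code)
        self.current_code = token_hex(16)

    def hear(self, code: str) -> None:
        """Record a code broadcast by a nearby phone."""
        self.heard_codes.add(code)

    def report_infection(self) -> set[str]:
        """Codes a diagnosed user chooses to upload; these reveal
        nothing about the people the user has been near."""
        return self.my_past_codes | {self.current_code}

    def check_exposure(self, published: set[str]) -> bool:
        """Compare published codes against codes heard locally -
        the match happens on the user's own phone."""
        return bool(self.heard_codes & published)
```

A phone that was near the diagnosed user finds a match; one that wasn’t finds nothing, and the server learns nothing about either.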
In the UK, NHSX, a new unit driving forward the digital transformation of health and social care, will lead the development of a new contact-tracing app, while Google and Apple have worked together to produce a decentralised system.
The Google/Apple system uses Bluetooth data. It appears to run on a similar model to Professor Rodriguez-Villegas’s team’s app, with frequently updated identifiers.
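One refinement in the Google/Apple design is that each day’s identifiers are derived from a single daily key, so a diagnosed user publishes only that key and other phones re-derive the identifiers locally. A simplified sketch, with HMAC-SHA256 standing in for the protocol’s actual HKDF/AES-based derivation and an illustrative interval count:

```python
import hmac
import hashlib
import os

def new_daily_key() -> bytes:
    """A fresh random 128-bit key per day (the real protocol calls
    this a Temporary Exposure Key)."""
    return os.urandom(16)

def derive_identifiers(day_key: bytes, intervals: int = 144) -> list[bytes]:
    """Derive one short-lived broadcast identifier per ~10-minute
    interval of the day. HMAC-SHA256 stands in here for the actual
    HKDF/AES construction; 144 intervals is an illustrative count."""
    return [
        hmac.new(day_key, f"interval-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    ]
```

A phone that heard identifier X simply checks whether X appears among the identifiers derived from any published key – so the download is one small key per infected user per day, rather than hundreds of individual codes.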
So far the protocols behind NHSX’s proposed app are unknown, although it appears that it will use a central database. That may bring it into conflict with the protocols being developed by Google and Apple: the two tech giants are collaborating on a way of avoiding interoperability problems, and encouraging health services to build decentralised systems.
Privacy is a mindset
“What I try to tell people,” says de Montjoye, “is that privacy is like security.” It requires openness, so that people can test it.
People who don’t work in cryptography often assume that the way to achieve security is to keep your methods secret – but security experts say the opposite. The cryptographer Bruce Schneier argues that “transparency and accountability don’t hurt security; they’re crucial to it”. You don’t keep your data safe by using some secret code; you do it by publishing your work so that others can see that it has no holes.
The same is true of privacy. “There’s a long history in privacy of people thinking they’ve done something smart, someone thinking they came up with something unique that no one will think about and break, and then the data is collected and someone re-identifies,” says de Montjoye. “Good privacy, like good security, relies on public, published protocols and code.”
Over the next several months, it may be that we rely heavily on contact-tracing apps to allow us some measure of freedom again; they could be a vital tool in the fight against COVID-19. But in order for them to work, they will need to be widely used; to be widely used they will need to be trusted; and for them to be trusted, they have to keep our data safe. Building good, transparent systems is not just good practice, it could save lives. “And it’s absolutely doable,” de Montjoye says.
The Forum is Imperial’s policy engagement programme. It connects Imperial researchers with policy makers to discover new thinking on global challenges. Our features provide a shop window into the world-leading research taking place at Imperial and provide insight into how it can inform and contribute to public policy debates.