Contact tracing apps highlight the importance of combining technical privacy fixes with a grounding in the international human rights framework.
Tracking app for COVID-19 on a smartphone. Photo by Thomas Trutschel/Photothek via Getty Images.
There is a legitimate fear that the use of contact tracing apps and other technologies to slow the spread of COVID-19 could tip the balance in the fight for privacy in the digital age, presenting a major challenge for policymakers, technology companies, and the human rights community.
The key issues are what level of intrusion into privacy is acceptable for rapidly developed data-driven responses to the COVID crisis, and who decides? In the short term, technical fixes are key to limiting the risks, but legal and other safeguards remain critical during the current crisis and beyond.
Under international law, such as Article 17 of the International Covenant on Civil and Political Rights and relevant regional treaties, interference with the right to privacy can only take place in limited circumstances where there is a legitimate aim.
Importantly, this may include some measures that aim to protect public health. But governments must ensure that any interference with privacy, such as collecting personal data on a government-authorized app, is actually justified and not arbitrary.
As a general rule, the greater the intrusion into people’s private lives, and the greater the potential impact on other human rights, the higher the burden on states to establish (and continue to demonstrate) that the measures are necessary. Emergency measures are expected to be time-bound.
If there is wide use of smartphones in a country, contact tracing apps may be seen as less restrictive than lockdown – a legitimate factor in deciding whether to adopt an app. However, apps can also present an unacceptable intrusion into private lives if they collect personal data but don’t deliver a clear public health benefit. Apps may speed up the process of tracing who has the virus and to whom it spreads, but high levels of testing, traditional contact tracing, and advice and support are also needed.
Even where a public health interest can be shown, the interference may still not be proportionate if there is a significant risk that data will be abused, or if data security is weak. This has prompted efforts to design apps that reduce privacy and security risks, such as minimizing what data is collected (for example, offering options to collect non-location data), and anonymizing and encrypting data.
There are also efforts to decentralize contact tracing apps. Technology offered by tech giants Apple and Google uses Bluetooth Low Energy, which allows apps to capture anonymized contact points while conserving the battery power needed for the app to ‘run’ in the background.
Under this model, any anonymized data exchanged between phones to identify contacts stays on the phone, so only limited information is transferred to centralized government databases. Apple and Google also indicate their terms of service require that apps are voluntary, are not used for non-coronavirus purposes, and that the functionality will be removed on a regional basis when no longer needed.
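The decentralized model described above can be sketched in a few lines of code. This is a simplified illustration, not the actual Apple/Google cryptographic protocol: phones broadcast short-lived anonymous identifiers derived from a secret daily key; if a user tests positive, only the daily key is published, and every other phone re-derives identifiers locally to check for a match. All names and the derivation function here are illustrative assumptions.

```python
import hashlib
import os

def daily_key() -> bytes:
    """A random per-device daily key; it never leaves the phone
    unless the user tests positive and consents to upload it."""
    return os.urandom(16)

def rolling_ids(key: bytes, intervals: int = 4) -> list:
    """Derive short-lived anonymous identifiers broadcast over Bluetooth.
    (A simplified stand-in for the real cryptographic derivation.)"""
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Phone A generates a key; Phone B records the anonymous IDs it hears nearby.
key_a = daily_key()
seen_by_b = set(rolling_ids(key_a))

# A tests positive and uploads only its daily key to the server.
published_keys = [key_a]

# B re-derives IDs locally from the published keys and checks
# for a match on-device; raw contact data never leaves the phone.
exposed = any(rid in seen_by_b
              for k in published_keys
              for rid in rolling_ids(k))
```

The design choice this illustrates is that the server only ever sees keys of confirmed cases, never the social graph of who met whom – matching happens entirely on each handset.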
The conditions for using Apple and Google’s proposed functions appear laudable. Tech platforms have a responsibility to respect human rights in their operations and, under pressure from civil society, are increasingly mindful of privacy and data protection concerns; they also have experience of managing vastly different regulatory and policy approaches in countries.
Voluntary apps can also become mandatory by default if, for example, transport providers or employers require their use. If individuals, such as migrants and others entering a country, feel they have no alternative then, as with cookies on websites, they may simply accept the terms offered.
A decentralized app may still rely on ‘risk scores’, which often use artificial intelligence and require careful scrutiny. If scores are overly broad, or rely only on symptoms self-reported by nearby contacts (again, where testing is limited), they can easily become arbitrary or even discriminatory, especially if linked to advice about self-isolating or to enforced quarantine measures.
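A toy sketch makes the arbitrariness concern concrete. The weights and structure below are invented for illustration and do not come from any deployed app: when testing is scarce, the score leans heavily on unverified self-reports, and small changes to those assumed weights can flip the advice a user receives.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    minutes: float           # duration of proximity
    confirmed_test: bool     # lab-confirmed case?
    symptoms_reported: bool  # unverified self-report only

def risk_score(contacts: list) -> float:
    """Toy weighted exposure score; the weights (1.0 / 0.3) are
    illustrative assumptions, not values from any real app."""
    score = 0.0
    for c in contacts:
        # Unverified self-reports get a lower, essentially arbitrary weight.
        weight = 1.0 if c.confirmed_test else (0.3 if c.symptoms_reported else 0.0)
        score += weight * min(c.minutes, 30) / 30  # cap each contact at 30 min
    return score

# Two users with identical exposure get very different scores depending
# on whether their contact could access a test.
tested = risk_score([Contact(30, True, True)])
untested = risk_score([Contact(30, False, True)])
```

If a quarantine threshold sat between those two scores, access to testing – not actual exposure – would decide who is confined, which is the discrimination risk the text describes.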
Strong democratic and legal safeguards are critical. Early efforts include the Australian law which makes it unlawful for anyone – including government officials – to use data from the government-approved app for non-coronavirus contact tracing purposes, and for employers to demand that staff use the app.
This law also reaffirms the role of independent regulators, and prevents unauthorized disclosure of data outside of Australia, reflecting community concerns that a foreign company – in this case US-based Amazon Web Services – will be involved in the app’s administration.
However, Australian privacy experts have suggested improvements for other countries considering similar legislation, and outsourcing to a US-based firm is expected to speed up a US–Australia agreement on cross-border data access for law enforcement, the terms of which could be influenced by the current pandemic.
Globally, civil society and academic initiatives, including collaborative efforts, have provided a range of dos and don’ts, such as the need for sunset clauses and for oversight bodies that include community voices on acceptable terms, with ongoing monitoring and evaluation.
A big test may be how apps are used when international travel resumes. To support travel (especially to avoid quarantine, as the EU hopes) means states agreeing on privacy as well as technical standards. Open and robust conversations on the requirements of the human rights framework for governments and others will be essential.
All measures introduced to combat COVID-19 are expected to be time-bound. So, when the time comes, if removing apps proves a relatively straightforward decision for Apple, Google and governments it will be a clear sign that we are either globally on top of the coronavirus, or that the apps have not been able to deliver public health value.
But if, along the way, these apps mutate to gather more data, are presented as a must-have for future pandemics, or the attraction of population-wide data for other purposes proves too tempting, then getting out of these apps could require more multi-stakeholder cooperation than the decision to adopt them in the first place.
Source: International Law and Governance