Every major disaster in history has led to calls for greater surveillance. The false trade-off between public security and personal privacy leads governments to demand increased scrutiny of the daily lives of the masses. The important questions, such as whether the steps undertaken go beyond the need of the hour and whether they are truly effective, are being ignored.
Today, I intend to tackle the three main questions recommended by the EFF (Electronic Frontier Foundation) that MUST be asked when considering private and public response measures to the pandemic.
1. Would the proposal work?
In all cases, the burden of proof should fall on the authority proposing a measure. In many cases, projects have been undertaken without a clear explanation to the public of how and why they would be useful. While it is understandable that there is little time to conduct extensive research, there must be clear communication about the problem being solved. For example, while calls to carry out location tracking of citizens to trace the spread of the virus are common, there is little evidence to suggest that such data (accurate only to within many metres) could meaningfully establish contact points.
2. Would it excessively intrude on our freedoms?
Measures undertaken must be proportional to the risk at hand. Theoretically, a company could launch a system to read a user’s every message to determine whether it reveals that they are infected by the virus – and then alert the government. However, this is not proportional to the issue at hand – and is an unforgivable breach of privacy. An example of a proportional measure is storing the names and contact details of international travellers during a pandemic. This targets a very specific, heightened risk of transmission and has a clear public benefit.
3. And are there sufficient safeguards?
There are some clear safeguards to be established when it comes to pandemic measures. The EFF lists 10, but a few key points stand out.
- Consent: the user must specifically “opt-in” to such a program
- Anti-bias: many major algorithms intentionally or accidentally display bias, as described in my fellow student’s article. All such measures must be explicitly tested for this weakness.
- Expiration: the most important safeguard. Powers gained during the pandemic should not be treated as permanent. After the crisis, all data collection must stop, and all data collected should either be deleted or minimised to what is strictly relevant to health concerns.
Technology can massively help in fighting the pandemic. It is up to us to decide whether we want to create what Edward Snowden calls an “architecture of oppression” or what I envision – “the operating system of humanity”.
By OMOTEC Student – Advait Sangle