What do you have to hide? That’s an issue raised by two comments about my post, Communication, Privacy, and Community in the New Normal.
One commenter asked, “What if the government or a private group knowing your real-time biometrics could save lives? Why do we hold the privacy of such data in such high regard?”
This post addresses these questions.
Privacy is not a simple binary issue. It is actually quite complex, involving legal rights, personal freedom, and political culture, among other things. It also involves system design and safeguards to prevent abuse.
Our privacy laws and norms have evolved in response to a history of abuses. In general, Americans (and many others) believe that people are entitled to a presumption of privacy, subject to certain exceptions and procedures to protect against abuse.
We may take this for granted in our generally free society, but if we had lived in a society with intrusive surveillance, we would value our freedom and privacy much more.
In the context of the coronavirus crisis, the issue isn’t whether to collect any data at all, but rather what data to collect, how to collect it, and how to reduce the risks that it will be abused.
My post includes excerpts from an article by historian Yuval Noah Harari, The World After Coronavirus, which describes some abuses.
It is crucial to remember that anger, joy, boredom and love are biological phenomena just like fever and a cough. The same technology that identifies coughs could also identify laughs. If corporations and governments start harvesting our biometric data en masse, they can get to know us far better than we know ourselves, and they can then not just predict our feelings but also manipulate our feelings and sell us anything they want – be it a product or a politician.
. . .
You could, of course, make the case for biometric surveillance as a temporary measure taken during a state of emergency. It would go away once the emergency is over. But temporary measures have a nasty habit of outlasting emergencies, especially as there is always a new emergency lurking on the horizon.
. . .
Even when infections from coronavirus are down to zero, some data-hungry governments could argue they needed to keep the biometric surveillance systems in place because they fear a second wave of coronavirus, or because there is a new Ebola strain evolving in central Africa, or because . . . you get the idea.
Asking people to choose between privacy and health is, in fact, the very root of the problem. Because this is a false choice. We can and should enjoy both privacy and health. We can choose to protect our health and stop the coronavirus epidemic not by instituting totalitarian surveillance regimes, but rather by empowering citizens. In recent weeks, some of the most successful efforts to contain the coronavirus epidemic were orchestrated by South Korea, Taiwan and Singapore. While these countries have made some use of tracking applications, they have relied far more on extensive testing, on honest reporting, and on the willing co-operation of a well-informed public.
My post also includes a link to an article, Coronavirus Tracking Apps Meet Resistance in Privacy-Conscious Europe. It describes systems designed to protect public health and privacy.
[I]n Europe – especially in Germany and Austria, where memories of authoritarian government excesses from the last century linger – many people have little desire to adopt the voluntary technology their governments have begun to promote.
. . .
The Austrian app uses the Bluetooth transmitter on users’ phones to monitor other phones that come near to them. It keeps that information on the phone. If a person later suspects he or she has come down with covid-19 or has received a formal diagnosis, that information can be uploaded from the app to alert others, anonymously, that they may have been exposed. If users want to stop being tracked, they can simply delete the app and the data. No central database exists.
Germany, France, the Netherlands and Britain are among those exploring apps that work along the same lines: opt-in rather than mandatory, with data kept anonymous, and no GPS information going to governments or telecom companies. The approach could appeal in the United States, where many citizens are likewise distrustful of government snooping.
. . .
“If you’re going to store all this stuff in the cloud somewhere, at some point, there will be some guy somewhere saying, ‘Oh, isn’t that interesting.’ If you have the key to all that data, they’ll find a new purpose,” said Sophie in ’t Veld, a Dutch member of the European Parliament . . . .
. . .
“Even if it’s done with the best of intentions, we need to be careful that these measures don’t become permanent,” said Diego Naranjo, the head of policy at European Digital Rights, an advocacy group that seeks to protect digital privacy.
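For readers curious how the decentralized design quoted above works in practice (random tokens exchanged over Bluetooth, data kept on the phone, anonymous upload only after a diagnosis, no central database), here is a rough sketch. The class and method names are illustrative inventions, not taken from the Austrian app or any real system; actual protocols add key rotation, time windows, and cryptographic derivation.

```python
import secrets

class Phone:
    """Minimal sketch of a decentralized exposure-notification app."""

    def __init__(self):
        self.my_tokens = []      # random tokens this phone has broadcast
        self.heard_tokens = []   # tokens received from nearby phones

    def broadcast_token(self):
        # A fresh random token; nothing about it identifies the user.
        token = secrets.token_hex(16)
        self.my_tokens.append(token)
        return token

    def hear(self, token):
        # Stored only on this phone -- no central database.
        self.heard_tokens.append(token)

    def report_diagnosis(self, bulletin):
        # On a positive test, the user may voluntarily upload their
        # *own* broadcast tokens to a public, anonymous bulletin.
        bulletin.extend(self.my_tokens)

    def check_exposure(self, bulletin):
        # Each phone compares the public bulletin against the tokens
        # it heard; the match happens locally, not on a server.
        published = set(bulletin)
        return any(t in published for t in self.heard_tokens)

    def delete_app(self):
        # Deleting the app deletes all the data.
        self.my_tokens.clear()
        self.heard_tokens.clear()

# Usage: two phones meet; one owner later reports a diagnosis.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast_token())    # phones exchange tokens nearby
bulletin = []                        # public, anonymous list
alice.report_diagnosis(bulletin)
print(bob.check_exposure(bulletin))  # True: Bob may have been exposed
```

Note the design choice the articles emphasize: the bulletin contains only meaningless random tokens, so no one reading it can learn who was near whom.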
When individual information is collected, privacy principles require that it be anonymized as much as reasonably possible, that access be limited to those with a legitimate need to perform an authorized purpose, and that data be deleted as soon as it is no longer needed.
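Those three principles — anonymize, limit access, delete when no longer needed — can be illustrated with a small sketch. Everything here is hypothetical (the role names, the 14-day retention window, the salt); it shows the shape of the safeguards, not any particular system's implementation.

```python
import hashlib
import time

RETENTION_SECONDS = 14 * 24 * 3600  # assumed 14-day retention window

AUTHORIZED_ROLES = {"epidemiologist"}  # hypothetical authorized role

records = []  # each record: {"pid": ..., "data": ..., "ts": ...}

def pseudonymize(user_id, salt):
    # Principle 1: replace direct identifiers with a salted hash,
    # so stored records cannot be linked back to a name directly.
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

def store(user_id, data, salt="example-salt"):
    records.append({"pid": pseudonymize(user_id, salt),
                    "data": data,
                    "ts": time.time()})

def read_records(role):
    # Principle 2: access limited to those with a legitimate need
    # to perform an authorized purpose.
    if role not in AUTHORIZED_ROLES:
        raise PermissionError("no legitimate need for this data")
    return list(records)

def purge_expired(now=None):
    # Principle 3: delete data as soon as it is no longer needed.
    now = now if now is not None else time.time()
    records[:] = [r for r in records
                  if now - r["ts"] < RETENTION_SECONDS]
```

A real deployment would of course need stronger measures (audit logs, encryption, independent oversight), but even this toy version shows that the safeguards are design decisions, not afterthoughts.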
Protecting privacy may actually increase and improve data collection. When authorities respect privacy, people are more likely to voluntarily share information. Conversely, when authorities don’t protect privacy, people are more reluctant to do so.
Problems with the “What Do You Have to Hide?” Standard
The “what do you have to hide?” standard is very problematic – especially for legal and other dispute resolution professionals. Protection of clients’ privacy is a fundamental principle of our work. It enables clients to talk freely and carefully consider their options.
Privacy protections are embedded in many parts of our law – and they aren’t limited to situations where people have something to hide.
For example, the Fourth Amendment’s protection against unreasonable searches and seizures isn’t limited to innocent people who pose no threat to others. Lawyer-client privilege and mediation confidentiality rules aren’t limited to innocuous communications. Educational privacy laws don’t prevent disclosure of grades only for students with 4.0 GPAs.
The point is that governments and private interests are required to protect privacy for everyone – unless and until they can justify doing otherwise.
We all have some things to hide. But it’s not even a matter of whether something would be embarrassing or threaten tangible interests.
It’s a matter of personal autonomy. Individuals value the power to decide what information of theirs to share, with whom, and under what circumstances.
Beyond individual autonomy, privacy promotes open societies, where people can live freely without worrying that others will use their information in ways they don’t want.
When individuals make choices about sharing their own information, “what do I have to hide?” is a legitimate decision standard.
When making policy, it is not.