Even During a Pandemic, Digital Privacy Matters

We do not have to become a data dystopia. But we must act quickly to stop the coronavirus from turning us into one.

Data collection and processing must be transparent, and individuals should be clearly informed about the purpose of data collection and how long their data will be retained.(Photo: Blogtrepreneur/flickr/cc)

Living through a pandemic already feels like an apocalyptic movie; nobody wants it to feel like an episode of Black Mirror, too. Sadly, we may already be setting the stage for a slate of terrifying episodes about how technology can be abused during a global health crisis. 

Being stuck at home during the outbreak makes all of us more dependent than ever on online purchases, digital services and delivery apps to meet our daily needs. But even as these services make our new socially distant lifestyle possible, our increased dependence comes with significant privacy risks. 

Big Tech companies may have stopped some of the price gouging on their platforms, but nothing is stopping them from “data gouging,” or collecting troves of new personal information on us that can be used to track and manipulate us in unprecedented ways. New shopping patterns, work-from-home tools and distance learning apps are harvesting reams of data about our families – data that can reveal a great deal about with whom we live, where our friends and family are at any moment and even our health status.

Existing laws and regulations offer few privacy protections for this data, which is collected as a matter of routine by tech companies and online vendors. We have needed a baseline federal privacy law for years, but Congress thus far has failed to pass one.

Already, the Trump administration and corporations are exploring ways to collect and process data en masse to address the unfolding public health and economic crises – whether it’s through tracking our location, purchases or health information. The CARES Act, the third coronavirus relief package passed by Congress, allocated $500 million for a public health surveillance system. In addition, the White House has reached out to tech companies with access to huge troves of consumer data for help during the crisis.

In some circumstances, it may be necessary to track the location of individuals who test positive for COVID-19. But if that data is shared with government agencies or businesses that have no role in public health, it could damage a person’s finances, employment or housing prospects. For example, a person might be denied a job, a loan or other financial product, or an insurance policy.

That’s why new data collection in response to the pandemic, or the sharing and processing of data in novel ways, should come with new privacy protections – even in the absence of a baseline federal privacy law.

Congress is considering a fourth relief package right now. Lawmakers still have the chance to include measures aimed at stopping the abuses that are likely to result from data gouging linked directly or indirectly to the pandemic. Here are some general principles that would help curtail those abuses:

First, extraordinary public health measures involving data collection introduced during the crisis should be limited in scope and duration, so they do not become permanent features of law. Any mass data collected must be necessary to resolve the crisis and gathered for a limited time, and should not be used or repurposed for marketing, advertising or other commercial purposes – or any unrelated research purposes without informed consent.

Second, data collection and processing must be transparent, and individuals should be clearly informed about the purpose of data collection and how long their data will be retained. All newly collected or processed data must be kept confidential and secure – and should be deleted automatically following the pandemic.

And third, it is not enough to expect tech giants and other companies that harvest and sell our data to keep the promises they make in their unregulated terms of service. We must hold companies accountable for violating these principles or failing to keep our data secure. Penalties must be severe enough to outweigh the financial benefits of breaking the law, which may end up being considerable.

In March, 15 groups urged Congress to incorporate these ideas into legislation in the hopes of stopping the public health emergency and economic meltdown from metastasizing into a digital privacy disaster.

We do not have to become a data dystopia. But we must act quickly to stop the coronavirus from turning us into one.

Emily Peterson-Cassin

Emily Peterson-Cassin is the digital rights advocate for Public Citizen.
