Apr 14, 2020
Living through a pandemic already feels like an apocalyptic movie; nobody wants it to feel like an episode of Black Mirror, too. Sadly, we may already be setting the stage for a slate of terrifying episodes about how technology can be abused during a global health crisis.
Being stuck at home during the outbreak makes all of us more dependent than ever on online purchases, digital services and delivery apps to meet our daily needs. But even as these services make our new socially distant lifestyle possible, our increased dependence comes with significant privacy risks.
Big Tech companies may have stopped some of the price gouging on their platforms, but nothing is stopping them from "data gouging," or collecting troves of new personal information on us that can be used to track and manipulate us in unprecedented ways. New shopping patterns, work-from-home tools and distance learning apps are harvesting reams of data about our families - data that can reveal a great deal about whom we live with, where our friends and family are at any moment, and even our health status.
Existing laws and regulations offer few privacy protections for this data, which is collected as a matter of routine by tech companies and online vendors. We have needed a baseline federal privacy law for years, but Congress thus far has failed to pass one.
Already, the Trump administration and corporations are exploring ways to collect and process data en masse to address the unfolding public health and economic crises - whether it's through tracking our location, purchases or health information. The CARES Act, the third coronavirus relief package passed by Congress, allocated $500 million for a public health surveillance system. In addition, the White House has reached out to tech companies with access to huge troves of consumer data for help during the crisis.
In some circumstances, it may be necessary to track the location of individuals who test positive for COVID-19, but that data could be damaging if shared with government agencies or businesses that have no role in public health. For example, a person's job prospects, ability to get loans or other financial products, or ability to buy insurance could be compromised.
That's why new data collection in response to the pandemic, or the sharing and processing of data in novel ways, should come with new privacy protections - even in the absence of a baseline federal privacy law.
Congress is considering a fourth relief package right now. Lawmakers still have the chance to include measures aimed at stopping the abuses that are likely to result from data gouging linked directly or indirectly to the pandemic. Here are some general principles that would help curtail those abuses:
First, extraordinary public health measures involving data collection introduced during the crisis should be limited in scope and duration, so they do not become permanent features of law. Any mass data collected must be necessary to resolve the crisis and gathered for a limited time, and should not be used or repurposed for marketing, advertising or other commercial purposes - or any unrelated research purposes without informed consent.
Second, data collection and processing must be transparent, and individuals should be clearly informed about the purpose of data collection and how long their data will be retained. All newly collected or processed data must be kept confidential and secure - and should be deleted automatically following the pandemic.
And third, it is not enough to expect tech giants and other companies that harvest and sell our data to keep the promises they make in their unregulated terms of service. We must hold companies accountable for violating these principles or failing to keep our data secure. Penalties must be severe enough to outweigh the financial benefits of breaking the law, which may end up being considerable.
In March, 15 groups urged Congress to incorporate these ideas into legislation in the hopes of stopping the public health emergency and economic meltdown from metastasizing into a digital privacy disaster.
We do not have to become a data dystopia. But we must act quickly to stop the coronavirus from turning us into one.
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.