Facebook is facing a political and regulatory siege on every conceivable front. The Federal Trade Commission (FTC) and 46 states are challenging the company's acquisitions of Instagram and WhatsApp--with divestiture being the sought-after remedy. The company's global head of safety testified to Congress in September to explain the company's recent efforts to attract more children to its digital properties. Merely a week later, whistleblower Frances Haugen proved to be a far more compelling witness, revealing the true extent of Facebook's knowledge of the harmful effects its products have on children and its fervent desire to collect data and extend its active user base to this "valuable but untapped audience." All these events take place against the backdrop of the most significant congressional antitrust investigation in decades, five proposed antitrust bills in the House of Representatives seeking to deconcentrate the technology sector, and other repugnant acts the company has committed over the past decade. New scandals detailing Facebook's conduct now surface almost daily.
And yet, despite warning letters from administrative agencies and Congress, and multiple bouts of deception and misrepresentation toward investors and public institutions, it's mostly business as usual at Facebook. Facebook continues to make vacuous or lackluster promises to reform its business model and align it with publicly acceptable market practices. But Facebook has repeatedly proven it is unwilling and unable to address the problems its business model creates. The new revelations from the Facebook Files released by the Wall Street Journal in October show that Facebook consistently and deliberately sought to keep users engaged and placed its own economic interests over the safety of its users.
In no way was this crisis with Facebook inevitable. The FTC had several opportunities to bring antitrust cases against the company and quash Facebook's business decisions. Instead, the agency chose to do nothing and allowed private corporations like Facebook to govern and structure technology markets on their own terms. Nevertheless, in addition to vigorously pursuing its antitrust case against the corporation, the FTC also has vast congressionally delegated powers that it can use to prohibit specific corporate conduct and force Facebook to use more publicly acceptable business practices. Now is the time to fully use these powers.
Facebook Is the Result of Repeated and Deliberate Policy Failures
At its core, Facebook provides a digital space for users to share information, which the corporation collects and archives so that it can refine sophisticated, black-box algorithms that push digital advertisements and content to its unmatched global user base. The bedrock on which Facebook's services sit is the set of rules governing user privacy, which are deeply connected to the data the corporation collects. Data and privacy share an intimate relationship because they are diametrically opposed: more privacy means less data to acquire and subsequently monetize.
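To make that pipeline concrete, here is a minimal, purely hypothetical sketch of the collect-profile-target loop described above. Every name, field, and number is an invented assumption for illustration; none of it reflects Facebook's actual systems or code.

```python
# Hypothetical sketch of a data-driven ad-targeting loop:
# collected user data becomes a profile, and the ad that best
# matches the profile (weighted by advertiser bid) is served.

user_profile = {
    "age": 17,
    "interests": {"fitness", "fashion"},  # inferred from collected activity
}

ads = [
    {"id": "ad-1", "targets": {"fitness"}, "bid": 0.40},
    {"id": "ad-2", "targets": {"fashion", "fitness"}, "bid": 0.55},
    {"id": "ad-3", "targets": {"cooking"}, "bid": 0.90},
]

def score(ad: dict, profile: dict) -> float:
    # Expected revenue: the bid weighted by how many collected
    # attributes the ad matches -- more data means higher-value matches.
    overlap = len(ad["targets"] & profile["interests"])
    return ad["bid"] * overlap

best = max(ads, key=lambda ad: score(ad, user_profile))
print(best["id"])  # "ad-2": the most data-matched ad wins
```

The point of the toy model is the incentive it encodes: every additional attribute collected raises the value of a match, which is why, under this business model, more privacy mechanically means less revenue.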
Privacy has always been central to much of the scrutiny surrounding Facebook. With the release of its Beacon service in 2007, Facebook decided to make nearly every action its users took on the internet public on their newsfeed. The episode sparked immediate outrage and crystallized questions about what methods Facebook should be allowed to use to collect user data. In 2011, the FTC filed a complaint against Facebook for unfair and deceptive practices regarding its privacy controls. The company settled with the FTC and agreed to supposedly tight restrictions on how it could change its privacy policies. Much of the settlement involved Facebook implementing a privacy program to determine how to protect user privacy, obtaining users' affirmative consent to share data, and providing certified privacy audits to the FTC for 20 years. Despite the existence of the 2011 settlement and the concern surrounding user privacy, the FTC put almost no effort into reviewing Facebook's acquisition of Instagram in 2012 or WhatsApp in 2014. Both companies are now critical components of Facebook's business empire.
The Cambridge Analytica scandal in 2018 was one of the first major events that revealed the extent to which Facebook allowed third parties to use its vast repositories of user data and what users with nefarious motives can do with it. In the aftermath of the scandal, the FTC used the 2011 settlement to impose even more restrictions on Facebook's privacy policies. Some restrictions included the creation of an independent board to review decisions concerning user privacy and additional compliance requirements. Furthermore, using the 2011 settlement, the FTC fined the company $5 billion--a paltry sum compared to the $70 billion in revenue the company generated that year.
What is clear from these events is that despite the seriousness of Facebook's conduct, regulators took no action to challenge Facebook's underlying business model. But Facebook, like all companies, is a product of the legal paradigm it operates within, which shapes and incentivizes all business strategies. As evidenced by the company's repeated confrontations with federal agencies and Congress that resulted in no substantial action, its existence was and still is a political choice. It is now unmistakably apparent that the public should no longer tolerate Facebook as a corporate entity or those companies that emulate its business practices.
While the public waits for much-needed congressional action, the FTC has broad congressionally delegated antitrust powers to remedy the current crisis with Facebook and enact robust privacy protections for billions of users that can alleviate other societal ills resulting from similar internet services. In effect, because of Facebook's dominance, imposing new regulations on the company is really about altering our relationship with the internet and directly confronts the decades-long reign of corporate decisions and policy failures that have led us to this point.
The FTC Can Take Vigorous Action
When Congress created the FTC in 1914, it deliberately gave the agency the power to stop "unfair methods of competition," and in 1938 it added the power to stop "unfair or deceptive acts or practices." These phrases were designed to be broad so that the agency could watch over a constantly changing marketplace and quickly prohibit practices that violated the evolving notion of publicly acceptable business conduct, ensuring fair competition between corporate entities. As such, Congress gave the FTC potent rulemaking powers, and controlling judicial precedent affirms that the agency can use them to prohibit industry-wide practices and "consider[] public values beyond simply those enshrined in the letter or encompass[ed] in the spirit of the antitrust laws."
Vigorously pursuing the antitrust case against Facebook must be one of the FTC's top priorities. At the same time, however, the FTC must use its broad rulemaking powers to ban the business practices that allow the harms Facebook creates to proliferate. Since data is the bedrock on which Facebook has built its digital empire, data usage and acquisition are what the FTC should target with new regulations. Here are a few policies the FTC could enact.
Policy #1: The FTC can limit what data can be collected.
Multiple surveys continue to confirm that users want significantly more privacy protections. But companies like Facebook know that users and advertisers are effectively unable to leave their platforms, and with no serious threat of structural change or legal deterrence, they have little reason to give customers the privacy protections they want: doing so would gut the primary means by which their revenue is generated.
To properly remedy this situation and the past regulatory failures, rather than have Facebook or any other corporation determine what data is proper to collect from users, the FTC can establish clear rules. For example, the FTC can use its rulemaking powers to prohibit the collection of data on a user's race, sexual orientation, gender, or national origin. Such variables have been routinely exploited and used to push misinformation as well as other racist and hateful content to users. In one particularly egregious example, Facebook allowed housing marketers to target their ads based on race and religion, in violation of federal fair housing laws. The company was eventually sued by the Department of Housing and Urban Development over the same practices, more than two years after they first came to light.
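As an illustration of how such a rule could be enforced at the point of collection, consider this minimal sketch of an ingestion filter that drops prohibited attributes before anything is stored. The field names and record format are hypothetical assumptions, not any real schema or regulatory text.

```python
# Hypothetical ingestion filter enforcing a collection ban on
# protected attributes; all field names are invented for illustration.

PROHIBITED_FIELDS = {"race", "sexual_orientation", "gender", "national_origin"}

def sanitize(record: dict) -> dict:
    """Drop prohibited attributes before the record is stored."""
    return {k: v for k, v in record.items() if k not in PROHIBITED_FIELDS}

raw = {"user_id": 42, "race": "example", "interests": ["gardening"]}
print(sanitize(raw))  # {'user_id': 42, 'interests': ['gardening']}
```

A rule written this way bans the data at the door: attributes that are never stored can never later be used to target housing ads by race or religion.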
Policy #2: The FTC can restrict how companies share collected data.
Before a user is allowed to use an internet service, they are presented with an unreadably long terms-of-service agreement. Buried in these non-negotiable contracts are terms that detail what data a company can collect and what it can use that data for. Companies exploit users by having them merely click "accept" without fully understanding what they are agreeing to. If users don't agree to these terms, they cannot use the service.
Traditional economists and many legal scholars see these agreements as fair, bargained-for contracts that judges should view with a high degree of deference and a presumption of legality. Yet events like the Cambridge Analytica scandal exposed the extent to which these agreements create unknowable and unforeseen problems for users, and why strict regulations are needed. In 2014, Facebook allowed third parties to use its vast repositories of user data by creating applications that connected with Facebook. This policy was embedded deep within Facebook's terms of service. Cambridge Analytica subsequently exploited this policy by creating a survey integrated with Facebook's login service. Users who logged into the service not only gave Cambridge Analytica their own data but also handed over data on their friends.
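A toy model makes the mechanics of that leak concrete. In the sketch below, an app authorized by one consenting user can also read that user's friends' records, mirroring the friend-permission design the paragraph describes. The graph, names, and function are hypothetical illustrations, not Facebook's actual Graph API.

```python
# Hypothetical model of friend-data sharing: one consenting user
# exposes data on friends who never agreed to anything.

social_graph = {
    "alice": {"friends": ["bob", "carol"], "data": {"likes": ["politics"]}},
    "bob":   {"friends": ["alice"],        "data": {"likes": ["sports"]}},
    "carol": {"friends": ["alice"],        "data": {"likes": ["music"]}},
}

def app_harvest(consenting_user: str) -> dict:
    """Return every record visible to an app this one user authorized."""
    harvested = {consenting_user: social_graph[consenting_user]["data"]}
    for friend in social_graph[consenting_user]["friends"]:
        # The friend never clicked "accept", yet their data leaves anyway.
        harvested[friend] = social_graph[friend]["data"]
    return harvested

print(app_harvest("alice"))  # alice consented; bob and carol did not
```

One survey-taker's click could thus expose an entire friend network, which is how a single quiz app reportedly reached tens of millions of profiles.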
Given recent events, it is clear that non-structural restrictions like fines and required audits do not adequately deter unlawful behavior. Here again, the FTC can use its vast rulemaking powers to declare specific corporate acts unfair methods of competition or unfair or deceptive acts or practices. The FTC can prohibit sharing data with third parties outright, or permit it only after clear, prominent, and repeated disclosure to, and permission from, users. In general, companies should be required to present what data they collect in the clearest way possible and should almost always be prohibited from sharing that data with third parties.
Policy #3: The FTC can prohibit targeted advertising entirely.
Digital advertising is the primary mechanism Facebook uses to obtain its profits and the driving force behind Facebook's policies of collecting massive amounts of user data. Just this past year, Facebook raked in roughly $84 billion in revenue, 98% of which came from digital advertising.
With exorbitant profits to be made, it is obvious why the practice of digital advertising creates the need and desire to track users and collect as much data from them as possible. As the recent revelations by Haugen made clear, Facebook uses this data to algorithmically drive harmful advertising and content to users across its network. The evidence shows that digital advertising also causes users to be exposed to lurid, hyperbolic, and misleading content specifically designed to keep them engaged with the platform. For example, Facebook's research found its algorithms boosted "misinformation, toxicity, and violent content." Facebook's research also showed that Instagram makes nearly a third (32%) of teenage girls "feel worse" about their bodies. In other words, when combined with unregulated algorithms, digital advertising can operate as a form of overt psychological harm to users. Facebook's own internal research found that one in eight of its users believe that Facebook impairs their sleep, work, parenting, or relationships.
Digital advertising as a business practice is also the critical means of keeping users addicted to the service so the company can "regularly tap into people's motivations to influence their habits." The greater the addiction, the greater the profits, which in turn create more opportunities to spread the kinds of harmful content that have produced many of the problems the public currently tolerates.
Again, the FTC's rulemaking powers provide ample authority to prohibit targeted advertising and properly align business conduct with publicly acceptable norms.
Without significant structural regulatory action, it is practically inevitable that the same cycle of scandal and media outcry will repeat itself, and Facebook's business model will only be further tolerated and enabled. The FTC can act, and it should do so now.