Moving Beyond Consent Models in Privacy Legislation

A few weeks ago, Public Knowledge convened a panel discussion: How Do We Move Beyond Consent Models in Privacy Legislation? The event featured a keynote from Senator Sherrod Brown (@SenSherrodBrown), followed by a panel including:

  • Sara Collins, Policy Counsel at Public Knowledge (@SNolanCollins)
  • Yosef Getachew, Director of the Media & Democracy Program at Common Cause (@ygetachew2)
  • Nathalie Marechal, Senior Policy Analyst at Ranking Digital Rights (@MarechalPhD)
  • Stephanie Nguyen, Researcher at Consumer Reports (@stephtngu)
  • Joseph Turow, the Robert Lewis Shayon Professor at the University of Pennsylvania (@JoeProf)

Context of the panel discussion

The thrust of the panel was to expand on Senator Brown’s recent Data Accountability and Transparency Act of 2020, a bill that, as The Washington Post highlighted, aims to “upend years of debate over a potential federal privacy bill by shifting the burden away from consumers and onto companies.” The bill also focuses on core elements such as data minimization, non-discrimination, accountability and transparency for algorithmic decision-making, and enabling individuals and state attorneys general to enforce the law in court.

“It comes back to power,” Senator Brown articulated. “We can’t really call it consent — I guess that’s the word we use — when huge corporations use their power to coerce us into signing away our basic rights like privacy.”

The panelists reiterated the lack of real choice in the consent paradigm and discussed how the framework can be part of a corporate strategy to push people into giving up more information. Faced with consent requests they cannot meaningfully refuse, many Americans become resigned to the whole process.

For context, Consumer Reports supports Senator Brown’s bill, and we also co-sponsored California legislation, AB 1760 (2019) and AB 3119 (2020), that would reform the CCPA to ensure privacy by default.

In the following sections, we highlight key concepts from the discussion:

What are the challenges with models of consent?

The group discussed several key concepts that outline the changing dynamics and nuances of the industry, and the difficulties those dynamics create for consent.

  • Consent through Internet of Things (IoT) devices: Hardware-connected devices are permeating our homes and spaces that are typically considered private, so people must shift their expectations and norms around devices that used to be offline, like children’s toys, home speakers, and security systems. These devices typically lack a direct user interface and must be paired with a phone or desktop for setup, which complicates already challenging processes like parent-child consent and efforts to humanize lengthy policies.
  • Data collection in public & private spaces: Devices like Amazon Ring shift norms around how people view public and private spaces, upending the broad societal notion that outdoor spaces are entirely public. It’s tricky to come up with any kind of legal solution that accounts for both private individuals’ rights to privacy and journalists’ rights to document abuses of power and other events.
  • Distinguishing sensitive vs. non-sensitive data: We also discussed the challenges of data consent. One panelist suggested that all data is sensitive data in some capacity. For example, most Americans who have access to a frequent shopper card for the supermarket use it. One can take something benign about a person’s life (like purchasing over-the-counter drugs or whether the person eats red meat) and apply AI or machine learning to infer things about people that can be used in a discriminatory way (see the sketch after this list).
  • Norms and structure of the digital advertising industry: The group discussed as problematic a business model built on collecting, aggregating, and sharing data across platforms without recourse. Some questioned the value actually being delivered to advertisers, since the model enables an entire industry of middlemen to extract rent by acting as intermediaries. In addition, research has shown that users do not know they’re being tracked and do not know what data mining is; that lack of understanding can result in people being discriminated against. We also shouldn’t assume all of this digital advertising and tracking technology is accurate.
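The panelist’s point that all data can be sensitive is, at bottom, a claim about statistical inference. As a minimal, hypothetical sketch (the purchase data, feature names, and labels below are invented for illustration, and any standard classifier would do), a retailer could train on ordinary loyalty-card histories to predict a sensitive trait shoppers never disclosed:

```python
# Hypothetical sketch: inferring a sensitive trait from "benign"
# loyalty-card data. All data and feature names are invented.
from sklearn.linear_model import LogisticRegression

# Each row: weekly purchase counts of [antacids, red meat, unscented lotion]
purchases = [
    [0, 4, 0],
    [5, 1, 2],
    [1, 3, 6],
    [6, 0, 5],
    [0, 5, 0],
    [4, 2, 5],
]
# A trait the shopper never disclosed, labeled from later behavior
# (1 = trait present, 0 = absent).
sensitive_trait = [0, 1, 1, 1, 0, 1]

model = LogisticRegression().fit(purchases, sensitive_trait)

# A new shopper's "benign" basket now yields a sensitive prediction
# that could feed discriminatory targeting or pricing.
print(model.predict_proba([[5, 1, 4]])[0][1])
```

Nothing in the training data is sensitive on its own; the discriminatory potential comes from the correlations the model learns.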

How do we move beyond consent models in privacy legislation?

Based on the panel conversation, here is a summary of the potential ideas we discussed, spanning policy and legal change, research, industry practice, and cultural shifts.

Policy and legal concepts:

  • Implement data minimization. Put the burden back on companies by setting standards that reduce the amount of data they collect. Data collection should be necessary and proportionate to delivering what the consumer expects from the product.
  • Incorporate non-discrimination principles for people who exercise privacy choices or who don’t agree to secondary data collection or use.
  • Civil rights protections must also apply to the digital ecosystem. Companies should be prohibited from using personal data to discriminate against members of marginalized communities.
  • Data processing should be constrained by default. Data sensitivity depends on context, as with financial, health, and children’s data. Panelists also discussed the potential benefit of treating all data as sensitive by default.
  • Emphasize civil rights protections, focusing on how harmful data practices are targeted at people of color and other marginalized communities.
  • Explore models of enforcement. How might we strengthen measures for external accountability through the FTC, state-level regulators, or other enforcement bodies?
  • Encourage a private right of action. Individuals should have the right to sue or file class-action lawsuits within their states. There is a need for robust enforcement all around, including a private right of action (PRA), as well as stronger regulation than we have today.
  • Prison sentences for accountable executives. Corporate leadership should face prison time if they lie about or violate people’s privacy.

The panelists focused most on policy interventions as the primary lever to provoke change from data-collecting platforms and companies. However, we also spoke about research opportunities, along with the industry and cultural shifts outlined below, that could help strengthen self-regulation and industry evolution.

Research ideas:

  • Look at how policymakers can prohibit certain uses of data, regardless of how it’s collected, in order to protect people. Ad targeting, for example, is a huge issue in election disinformation and voter suppression.
  • Create more transparency for consumers and regulators into how digital advertising generates revenue. We need to better understand how middleman companies work, along with ad servers, software, and automated decision-making systems, including how data is sorted through and used on a real-time basis.
  • Explore how digital advertising disparately impacts marginalized communities and people of color. It is important to continue refining our understanding of the elements that may influence consumers’ responses to advertising, specifically among groups that have traditionally been targeted or excluded from access to opportunities.

Industry ideas:

  • Consider alternative advertising models. Create mechanisms that can push the advertising industry toward, for example, contextual advertising, which matches ads to the content being viewed, as opposed to data aggregation and behavioral advertising (see the sketch after this list).
  • Establish “monitor” roles within companies. In the EU, companies have internal monitors, comparable to data protection officers under the GDPR, who are delegated to make sure policies are carried out; there is no required counterpart in the US. We note that this may help on the margins, but if monitors are enforcing flimsy interpretations of the law, the idea may not have the intended effect.
  • Establish more “privacy default” norms in products and services. Instead of requiring users to opt out of invasive data collection, companies should simply not perform invasive processing that isn’t necessary for the product to function (and “necessary” should not include monetization).
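To make the contextual-versus-behavioral contrast concrete, here is a minimal, hypothetical sketch (the ad inventory, keywords, and page text are invented): the ad is selected from the words on the page being viewed, so no user profile or cross-site tracking is involved at all.

```python
# Hypothetical sketch: contextual ad selection. The ad is matched to
# the page's content, not to a profile of the person viewing it.
AD_INVENTORY = {
    "hiking boots": {"trail", "hike", "outdoor", "mountain"},
    "stand mixer": {"recipe", "baking", "flour", "dough"},
    "car insurance": {"car", "vehicle", "driving", "insurance"},
}

def pick_contextual_ad(page_text: str) -> str:
    """Return the ad whose keywords best overlap the page's words."""
    words = set(page_text.lower().split())
    return max(AD_INVENTORY, key=lambda ad: len(AD_INVENTORY[ad] & words))

page = "Our favorite mountain trail snacks to pack before you hike"
print(pick_contextual_ad(page))  # -> "hiking boots"
```

The design point is that every input is page-side; a behavioral system would instead key the same decision off a cross-site profile of the viewer, which is exactly the data aggregation the panel flagged.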

Cultural shifts:

  • Shift our language away from the word “consent” and toward “How are we using your data?” Consent anchors the concept in a single moment in time, rather than in the ongoing series of moments where data may be exchanged to obtain value from a product, service, or platform. In general, people shouldn’t have to know or worry about managing the details of data collection and use. Ideally, they should feel comfortable that nothing shady is happening and be able to find out more if they want, without bearing the cognitive load of having to understand this information.

There is no single method or angle for moving beyond consent models; the panelists discussed a multi-pronged approach with participation from many stakeholders in order to make sustainable and substantive change. The aim is to take the burden off consumers and put the onus back on big tech by scaling back permitted uses of personal data to protect digital user rights.

“We want to see a world where you can have confidence that most of your data is protected from collection entirely, and if you do authorize data to be used by a company for some reason […] it shouldn’t be sold to other people or used for any other purposes,” Senator Brown explained. “That’s what real power over data looks like.”

Many thanks to Public Knowledge for hosting and organizing this session, and to the panel participants. You can view the webinar here:

Related articles shared for the panel:

You can also email your members of Congress to ask for comprehensive privacy legislation at publicknowledge.org/DataProtection.
