The American Privacy Rights Act of 2024 (APRA)

The U.S. Congress has been trying for several years to adopt comprehensive federal privacy legislation. In April 2024, a bipartisan and bicameral proposal was introduced in the form of a "discussion draft"1. It follows several earlier attempts, including the American Data Privacy and Protection Act (ADPPA)2 of 2022, from which it borrows some elements. After states such as California and Illinois adopted their own laws, Congress is aiming to take the lead and define a single national approach that preempts state laws3 and avoids a patchwork of legislation. APRA also echoes the GDPR.


Initial reactions have been rather positive4, acknowledging a significant legislative turning point and a step forward for data protection. However, it is uncertain whether the law can be adopted before the end of the year, given the U.S. political calendar (November elections).

The four objectives of the American Privacy Rights Act

  • Provide Americans with a fundamental right to digital privacy.
  • Create a national law regulating how companies can use personal data.
  • Limit companies' ability to track, anticipate, and manipulate individuals' behaviors for profit without their knowledge and consent.
  • Establish a uniform national standard for data privacy and security in the United States.

Summary of the bill5

The bill aims to establish consumer data privacy rights at the national level. It would require covered entities to be transparent about how they use consumer data and grant consumers the right to access, correct, delete, and export their data, as well as the ability to opt out of targeted advertising and data transfers.

It sets data minimization standards: companies may collect and use covered data only to the extent necessary to provide a specific product or service requested by the consumer, or for one of 15 enumerated permitted purposes (e.g., "protecting data security", "defending claims"). According to the International Association of Privacy Professionals (IAPP)7, such minimization is unprecedented in U.S. law, with the possible exception of the recently adopted Maryland law6.
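As a rough illustration, here is a minimal sketch in TypeScript of how such a minimization gate could be expressed. The function and field names are hypothetical, and only the two permitted purposes quoted above are listed; nothing here reproduces the bill's own drafting.

```typescript
// Hypothetical sketch: gating a processing operation on APRA-style data minimization.
// Only the two permitted purposes quoted above are listed; the draft enumerates 15.
const PERMITTED_PURPOSES = new Set<string>([
  "protecting data security",
  "defending claims",
  // ...the remaining enumerated purposes are omitted here.
]);

interface ProcessingRequest {
  purpose: string;
  // True when the data is necessary to provide a product or service
  // the consumer has specifically requested.
  necessaryForRequestedService: boolean;
}

// A processing operation passes the minimization gate only if it is tied to a
// requested product/service or matches an enumerated permitted purpose.
function isProcessingAllowed(req: ProcessingRequest): boolean {
  return req.necessaryForRequestedService || PERMITTED_PURPOSES.has(req.purpose);
}

// Example: behavioural profiling for resale would not pass this gate by itself.
console.log(isProcessingAllowed({ purpose: "profiling for resale", necessaryForRequestedService: false })); // false
console.log(isProcessingAllowed({ purpose: "protecting data security", necessaryForRequestedService: false })); // true
```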

The transfer of sensitive covered data to third parties without the consumer's express consent is prohibited. Data of minors under 17 is treated as sensitive data.

The law would prohibit the use of sensitive data for discriminatory purposes against consumers (extended civil rights protection).

It would grant consumers the right to opt out of the use of covered algorithms for decision-making. The use of "dark patterns" to deceive users is also prohibited. It introduces a three-tier enforcement system: the Federal Trade Commission (FTC), state attorneys general (or their chief consumer protection officers), and consumers themselves could enforce violations of the law.

It thus establishes strict enforcement mechanisms to hold violators accountable and introduces a private right of action, allowing individuals to sue the companies involved if they are harmed by a data breach.

This right allows individuals to enforce various provisions and to seek damages, injunctions, declaratory judgments, and "reasonable" legal fees and litigation costs.

 

Cathy McMorris Rodgers, chair of the House Committee on Energy and Commerce and co-author of the bill, adds: "It reins in Big Tech by prohibiting them from tracking, predicting, and manipulating people’s behaviors for profit without their knowledge and consent."8

Main Provisions 

Scope of application9

The scope of “covered entities”10 is broad. It covers commercial companies and non-profit organizations but excludes the government and its agencies. It includes entities subject to FTC authority, as well as telecommunications operators regulated by the Federal Communications Commission (FCC).

Service providers processing data on behalf of covered entities are also included. Small businesses are excluded if they have annual revenue below $40 million, process the covered data of fewer than 200,000 individuals, and do not derive revenue from transferring covered data to third parties.
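To make the scope thresholds concrete, here is a minimal sketch of the small-business exclusion as summarised above; the interface and field names are illustrative assumptions, not anything defined by the bill.

```typescript
// Hypothetical sketch of the small-business exclusion; field names are illustrative.
interface EntityProfile {
  annualRevenueUsd: number;               // annual turnover
  individualsWithDataProcessed: number;   // individuals whose covered data is processed
  earnsRevenueFromDataTransfers: boolean; // revenue from transferring covered data to third parties
}

// An entity is excluded from APRA's scope as a small business only if all
// three conditions summarised above hold.
function isExcludedSmallBusiness(e: EntityProfile): boolean {
  return (
    e.annualRevenueUsd < 40_000_000 &&
    e.individualsWithDataProcessed < 200_000 &&
    !e.earnsRevenueFromDataTransfers
  );
}
```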

Key Definitions

 

  • "Covered algorithm" is a computational process that makes a decision or facilitates human decision-making by using covered data.
  • "Covered data" includes information that identifies or is linked or reasonably linkable to an individual, including in combination with other information.
  • "Individuals" means a natural person residing in the U.S.
  • "Sensitive data"11 is defined broadly to include data related to government identifiers; health; biometrics; genetics; financial accounts and payments; precise geolocation; log-in credentials; private communications; revealed sexual behavior; calendar or address book data, phone logs, photos and recordings for private use; intimate imagery; video viewing activity; race, ethnicity, national origin, religion or sex; online activities over time and across third-party websites; information about a minor under the age of 17; and other data the FTC defines as sensitive covered data by regulation.
  • "Third-party" means any entity that receives covered data from another entity, except service providers. All "covered entity" requirements apply to third parties, except sensitive data.

Specific obligations for other entities


In addition to companies falling within the general scope of application, the bill imposes additional obligations on three types of organizations (their thresholds are illustrated in the sketch after this list):

1. "Large data holders," with an annual turnover exceeding $250 million, processing personal data of more than 5 million individuals, 15 million portable devices, and 35,000 connected devices, or processing sensitive data of more than 200,000 individuals, 300,000 portable devices, and 700,000 connected devices. These companies must publish their protection policy for the last ten years, annual reports on their transparency, create a "privacy" and "security officer", conduct biannual audits and impact assessments on protection, and submit an algorithm impact assessment to the FTC when they pose a risk of harm.

 

2. The bill introduces obligations for a new category of actors, "covered high-impact social media companies": platforms primarily used to access or share user-generated content (UGC) that generate more than $3 billion in annual revenue and have more than 300 million monthly active users worldwide. Users' activity on these platforms must be treated as sensitive data.

 

3. Finally, "data brokers" are entities that either derive more than 50% of their revenue from processing or transferring covered data they did not collect directly from individuals, or derive revenue from processing or transferring such data when it concerns more than 5 million individuals.

Data brokers must also provide specific notices to consumers, register in an FTC-managed registry, and comply with "do not collect" requests submitted through the FTC-managed opt-out mechanism.
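The sketch below gathers the three sets of thresholds described in this list into simple classification helpers, using the figures as summarised above; all field and function names are hypothetical.

```typescript
// Hypothetical sketch combining the three threshold-based categories described above.
// Field names and figures mirror the summary in the text, not the bill's own drafting.
interface EntityMetrics {
  annualRevenueUsd: number;
  // Covered data processed
  individualsCovered: number;
  portableDevicesCovered: number;
  connectedDevicesCovered: number;
  // Sensitive data processed
  individualsSensitive: number;
  portableDevicesSensitive: number;
  connectedDevicesSensitive: number;
  // High-impact social media criteria
  isUgcPlatform: boolean;               // primarily used to access or share user-generated content
  monthlyActiveUsers: number;
  // Data broker criteria
  revenueShareFromIndirectData: number; // share of revenue (0..1) from data not collected directly from individuals
  indirectDataIndividuals: number;      // individuals covered by such indirectly collected data
}

function isLargeDataHolder(m: EntityMetrics): boolean {
  const coveredThreshold =
    m.individualsCovered > 5_000_000 ||
    m.portableDevicesCovered > 15_000_000 ||
    m.connectedDevicesCovered > 35_000_000;
  const sensitiveThreshold =
    m.individualsSensitive > 200_000 ||
    m.portableDevicesSensitive > 300_000 ||
    m.connectedDevicesSensitive > 700_000;
  return m.annualRevenueUsd > 250_000_000 && (coveredThreshold || sensitiveThreshold);
}

function isHighImpactSocialMedia(m: EntityMetrics): boolean {
  return (
    m.isUgcPlatform &&
    m.annualRevenueUsd > 3_000_000_000 &&
    m.monthlyActiveUsers > 300_000_000
  );
}

function isDataBroker(m: EntityMetrics): boolean {
  const earnsRevenueFromIndirectData = m.revenueShareFromIndirectData > 0;
  return (
    m.revenueShareFromIndirectData > 0.5 ||
    (earnsRevenueFromIndirectData && m.indirectDataIndividuals > 5_000_000)
  );
}
```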


Sources:

1. 140 pages: https://d1dth6e84htgma.cloudfront.net/PRIVACY_02_xml_005_6e97fe914c.pdf
2. "Overview of the American Data Privacy and Protection Act", ADPPA H.R. 8152.
3. With exceptions (employee privacy, student privacy, data breach notifications, and health privacy), as well as the California Consumer Privacy Act and the Illinois Biometric Information Privacy Act. States remain authorized to adopt their own laws on civil rights and consumer protection.
4. See the Senate Commerce Committee press release: "What Others Are Saying: The American Privacy Rights Act".
5. https://www.commerce.senate.gov/services/files/E7D2864C-64C3-49D3-BC1E-6AB41DE863F5
6. https://mgaleg.maryland.gov/mgawebsite/Legislation/Details/SB0541?ys=2024RS
7. https://iapp.org/news/a/top-takeaways-from-the-draft-american-privacy-rights-act/
8. https://energycommerce.house.gov/posts/committee-chairs-rodgers-cantwell-unveil-historic-draft-comprehensive-data-privacy-legislation
9. The IAPP's "cheat sheet" provides a one-page infographic detailing the main obligations by actor.
10. "Covered entities" are analogous to the GDPR's "controllers", and "service providers" to its "processors".
11. According to TechPolicy, this definition of sensitive data is broader than the GDPR's.

 

For greater trust and privacy, try Axeptio today!

Dr Jean Paul Simon is the founder of JPS Public Policy Consulting, a consulting firm specialised in media/telecom law, regulation, and strategy. He has held various positions in the telecom industry and has worked as a senior scientist at the Institute for Prospective Technological Studies and at the European Commission.
