
Privacy Law 2023 “State of Play”

In the world of privacy law, new regulatory frameworks are contemplated regularly, and existing laws are constantly updated through guidance and amendments. The result is an immensely dynamic regulatory landscape, which makes it challenging to understand the current “state of play,” including a given business’s compliance obligations. Here, we provide a brief overview of some of the most notable developments in privacy law and data protection in 2023 to provide some clarity.

No Comprehensive U.S. Federal Privacy Law But An Ever-Expanding Patchwork Of State Laws

While there is currently no comprehensive federal privacy law regulating the processing of personal information (there are sector-specific privacy laws such as HIPAA), there have been attempts to pass one, most recently via the American Data Privacy and Protection Act (ADPPA). Without a federal law, the regulatory void is filled by states passing their own comprehensive laws. Currently, the states with comprehensive privacy laws in effect, or that will go into effect this year, are California, Utah, Connecticut, Colorado, and Virginia. In addition, many other states are considering similar legislation.

The practical result of this patchwork regulatory framework is increased complexity and cost for businesses that must navigate a myriad of different laws. For example, the Information Technology and Innovation Foundation estimates that the out-of-state compliance costs, if each state passed its own privacy law, could exceed $1 trillion over ten years, with at least $200 billion directly impacting small businesses.

A Focus On Profiling And Automated Decision-making

While laws such as the European Union’s GDPR and state laws such as California’s CCPA, as amended by the CPRA, differ on the specifics, they and other privacy laws impose restrictions on profiling and automated decision-making (ADM) due to the potentially harmful effects these practices can have on individuals. Generally, “profiling” refers to gathering information about an individual and evaluating their characteristics or behavior patterns, such as predicting their performance ability in the employment context or their purchasing behavior. “Automated decision-making,” on the other hand, refers to making a decision based on such data. There is considerable nuance in what a particular privacy law deems to be profiling or ADM, including whether the decision has “legal or similarly significant effects” and whether a human is involved in the decision-making process. Regardless, a business engaging in such activities must analyze the corresponding compliance requirements.

Beware Of The Sensitive Treatment Of “Sensitive Information”

“Sensitive information” is another area of particular regulatory focus in privacy law. Though, as with profiling and automated decision-making, it is defined differently from law to law, the sensitive nature of the information and the potentially detrimental effects its use or disclosure can have on people draw additional regulatory scrutiny. As an example of what is regulated under laws that impose additional obligations for “sensitive information,” under California’s CCPA, as amended by the CPRA, “sensitive personal information” means:

(1) Personal information that reveals:

(A) A consumer’s social security, driver’s license, state identification card, or passport number.

(B) A consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account.

(C) A consumer’s precise geolocation.

(D) A consumer’s racial or ethnic origin, citizenship or immigration status [added in October 2023 under the expanded definition of “Sensitive Information”], religious or philosophical beliefs, or union membership.

(E) The contents of a consumer’s mail, email, and text messages unless the business is the intended recipient of the communication.

(F) A consumer’s genetic data.

(2)(A) The processing of biometric information for the purpose of uniquely identifying a consumer.

(B) Personal information collected and analyzed concerning a consumer’s health.

(C) Personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

(3) Sensitive personal information that is “publicly available” pursuant to paragraph (2) of subdivision (v) shall not be considered sensitive personal information or personal information.

The result of this regulatory focus on sensitive personal information is greater compliance obligations that a business must account for, including data mapping, impact assessments, navigating the difference between opt-in (VCDPA) and opt-out (CPRA) regimes, and honoring consumer rights such as the right to limit the use of sensitive information.

Data Minimization Is Becoming A Requirement

Given the value that data holds for businesses, there is a great temptation to collect as much data as possible and retain it for as long as possible. While this approach has business rationales, inherent risks accompany such a maximalist modus operandi, including a greater chance of damage from a data breach and an increased chance of improper use of the data. Data maximization also increases the likelihood that customer expectations will not align with business practices. Specifically, a user might not expect that certain personal information is collected at all, or the customer might be aware that it is collected but unaware that it will be used for certain purposes or retained indefinitely. In light of these risks, privacy laws such as the GDPR and the CCPA, as amended by the CPRA, require data minimization. While there is nuance in what each law requires, in essence, data collected should be used only for the purposes disclosed to and expected by customers, and indefinite retention periods are no longer allowed.

Renewed Focus On “Health Information” And Location Data

In part because of the generally greater sensitivity of location and health information, and in the wake of the Dobbs decision, regulators are increasingly focusing on the use, disclosure, and storage of such information, even when it falls outside the scope of HIPAA. For example, the FTC has stated that “there is a behind-the-scenes irony” in that “data that people choose not to disclose even to family, friends, or colleagues is actually shared with complete strangers.” The FTC has also taken the position that “misuse” of certain kinds of sensitive information can be “illegal” under Section 5 of the FTC Act. In line with this enforcement focus, the FTC issued an order banning BetterHelp from disclosing consumer mental health information to various advertising partners and requiring it to pay $7.8 million for “deceiving consumers” in contravention of promises to keep sensitive personal data private.

International Data Transfers Remain A Moving Target

One of the most significant challenges for companies concerns the international transfer of personal data from the European Union to the United States. The friction mainly stems from differences in data protection laws between the two jurisdictions: the E.U. has strict data protection rules under the General Data Protection Regulation (GDPR), which governs the collection, storage, and use of the personal data of E.U. residents, while the U.S. has a more fragmented data protection regime. Most notably, in “Schrems II,” the Court of Justice of the European Union (CJEU) ruled that U.S. law does not provide protections essentially equivalent to those required under E.U. law. In particular, the CJEU determined that U.S. surveillance programs, which allow government agencies to access the personal data of non-U.S. citizens, are incompatible with E.U. data protection law. The decision rendered the Privacy Shield, previously the key mechanism for data transfers from the E.U. to the U.S., inoperative.

In the absence of an “adequacy decision,” companies have undertaken resource-intensive initiatives to implement additional measures to allow for data transfers, such as standard contractual clauses, binding corporate rules, or obtaining explicit consent from data subjects, to ensure that personal data is adequately protected when transferred from the E.U. to the U.S.

There is reason for optimism about a new adequacy decision, though. In October 2022, President Biden signed an Executive Order aimed at implementing the European Union-U.S. Data Privacy Framework. Several procedural steps remain to be navigated before we will know whether a new adequacy decision will be adopted.

A Challenging Compliance Environment, But Key Steps Go A Long Way

It can be overwhelming for businesses, especially those with limited resources, to understand and meet their compliance obligations within a confusing and ever-changing privacy law and data protection landscape. Still, there are key steps that go a long way toward achieving compliance with the fundamental tenets of the most broadly applicable privacy regulations.