Ad Law Access
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access
Updates on advertising law and privacy law trends, issues, and developments

AI Legislative and Regulatory Efforts Pick Up Steam: What We’re Watching
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ai-legislative-and-regulatory-efforts-pick-up-steam-what-were-watching
Wed, 03 Jul 2024 13:00:00 -0400

AI capabilities are growing by the day, and with them, so are government efforts to put in place guardrails, principles, and rules to govern the AI space. In May alone, Utah’s Artificial Intelligence Policy Act became the first state-level AI law to take effect, Colorado and Minnesota enacted new laws addressing AI, and the European Union passed historic comprehensive AI regulations. Meanwhile, the FTC continues to issue AI-related guidance materials that emphasize the importance of transparency in human-AI interactions, especially those involving native advertising (prior guidance here and here). Amid this flurry of activity, we outline below the new laws and the key bills, standards, and initiatives we’re watching.

Federal Efforts

American Privacy Rights Act

Last week, the House Energy and Commerce Committee abruptly canceled a scheduled markup of the latest American Privacy Rights Act (APRA) discussion draft, Congress’s most recent comprehensive privacy proposal. Some privacy advocates welcomed the cancellation, strongly opposing the removal of AI and civil rights protections in the latest draft. These protections included prohibitions against algorithmic discrimination and requirements for transparency and impact assessments for AI systems.

At present, it seems APRA may not advance as far as the 2022 American Data Privacy and Protection Act, which was passed out of the Energy and Commerce Committee but ultimately never received a floor vote. With the August recess approaching and the October break ahead of the November elections, the likelihood of any comprehensive privacy legislation reaching the House floor this year seems dim. However, we will continue to monitor these federal legislative efforts and their potential impact on AI providers.

White House Executive Order

Last year, the White House released the federal government’s first comprehensive guidelines regarding AI. Although the Executive Order focuses almost entirely on the government’s own use of AI, the ultimate effects of the order will be significant for private sector businesses engaging with federal agencies.

Pursuant to the Executive Order, on April 29, 2024, NIST released a draft risk management profile specifically addressing generative AI. The Generative AI Profile—which is intended as a companion resource to NIST’s AI Risk Management Framework—offers voluntary best practice guidance regarding the design, deployment, and operation of generative AI systems. As states continue to draft AI legislation, the NIST AI Risk Management Framework will likely continue to serve as an instructive reference point for legislators across the country.

State Legislation

Colorado AI Act

The Colorado AI Act, SB 205, is now set to take effect February 1, 2026, although the freshly signed law is already slated for revisions: in a recent letter, Gov. Jared Polis, AG Phil Weiser, and Senate Majority Leader Robert Rodriguez acknowledged that “a state by state patchwork of regulation” on AI poses “challenges to the cultivation of a strong technology sector” and promised to engage in a process to revise the new law to “minimize unintended consequences associated with its implementation.”

As drafted, the law introduces new obligations and reporting requirements for both developers and deployers of AI systems. Key requirements include:

  • Transparency. Moving forward, any business that uses AI systems to interact with consumers must disclose this fact during those interactions.
  • Algorithmic Discrimination in High-Risk AI Systems. The new law seeks to combat “algorithmic discrimination,” where the use of AI results in outcomes that disfavor consumers based on several personal and sensitive data categories. High-risk AI systems are defined as systems used to make decisions about individuals in the areas of education, employment, finance or lending, government services, healthcare, housing, insurance, and legal services. Developers and deployers of such systems have a duty to use reasonable care to protect consumers from known or reasonably foreseeable risks of algorithmic discrimination, and the law identifies specific obligations such entities must undertake.
  • Consumer Notice, Correction, and Opt-Out Rights. Consumers must be notified when high-risk AI systems are used to make any decisions about them in the areas outlined above (e.g., education, employment, etc.), and must have the right to correct inaccurate data and appeal the decision to a human reviewer.
  • Existing Obligations Under the Colorado Privacy Act (CPA). Deployers must also respect the existing rights of consumers under the CPA, including the right to opt out of the processing of personal information for profiling with legal or similarly significant effects concerning the consumer, including decisions made using AI. In April, Colorado amended the CPA’s definition of sensitive data to include both biological and neural data used either in isolation or in combination with other personal data elements for identification purposes. The CPA additionally creates AI-related disclosure obligations, requiring businesses to provide privacy policy language that details the personal data categories used for profiling, a plain-language explanation of the AI logic in use, explanations describing its benefits and potential consequences, and text explaining whether the system has been evaluated for accuracy, fairness, or bias.
  • Enforcement. The Colorado attorney general has sole authority to enforce the Colorado AI Act, and the law includes no private right of action. Violations are treated as breaches of Colorado’s general consumer protection laws, which carry a maximum civil penalty of $20,000 per violation. Notably, each violation is counted individually for every affected consumer or transaction, so just 50 impacted consumers could result in a maximum civil penalty of $1 million (see the illustrative calculation after this list). Actions must be brought within three years of the violation or its discovery.
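
For a sense of how quickly exposure scales when every affected consumer counts as a separate violation, here is a minimal illustrative calculation. It simply multiplies the statutory per-violation cap by a consumer count; it is a sketch of the arithmetic described above, not a prediction of how penalties would actually be assessed.

```python
# Illustrative only: upper bound on civil penalties under the Colorado AI Act,
# where each affected consumer or transaction counts as a separate violation.
MAX_PENALTY_PER_VIOLATION = 20_000  # cap under Colorado's consumer protection laws

def max_exposure(affected_consumers: int) -> int:
    """Worst-case exposure if every violation draws the maximum penalty."""
    return affected_consumers * MAX_PENALTY_PER_VIOLATION

print(f"${max_exposure(50):,}")  # -> $1,000,000, matching the example above
```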

We’ll keep an eye on whether all these requirements survive the revision process suggested above.

Utah Artificial Intelligence Policy Act

On May 1, 2024, Utah’s Artificial Intelligence Policy Act, SB 149, became effective. Generally, Utah’s legislature has taken a far lighter-touch approach to AI regulation than Colorado’s. Key takeaways include:

  • Disclosure Upon Request. Most businesses and individuals will only be required to disclose the use of AI when prompted by a consumer.
  • Disclosing the Use of AI in Regulated Professions. Businesses and individuals operating within regulated professions (e.g., healthcare professionals) must prominently disclose the use of AI before its use with customers.
  • Responsibility for Generative AI Outputs. Companies are responsible for the outputs of their generative AI tools and cannot pass on blame if those tools violate Utah consumer protection laws.

Comprehensive State Privacy Laws

Twenty states have now passed comprehensive state privacy laws: California, Colorado, Connecticut, Delaware, Florida, Indiana, Iowa, Kentucky, Maryland, Minnesota, Montana, Nebraska, New Hampshire, New Jersey, Oregon, Rhode Island, Tennessee, Texas, Utah, and Virginia. These states, with the exceptions of Utah and Iowa, impose additional requirements on companies engaging in “profiling,” which is defined as the automated processing of personal data to analyze or predict something personal about an individual, such as one’s economic situation, behavior, health, or personal preferences. Under these laws, consumers must be able to opt out of being profiled in a manner that could lead to a “legal effect” on that consumer or another “similarly significant effect.” Although a few of these laws are currently effective, the majority come into effect over the next few years. Here are the key dates to keep in mind:

  • Effective in 2024. Florida, Montana, Oregon, and Texas have comprehensive privacy laws coming into effect in the next several months.
  • Effective in 2026. Kentucky and Indiana have enacted comprehensive data privacy laws that will become effective on January 1, 2026. The Rhode Island legislature also passed the Rhode Island Data Transparency and Privacy Protection Act, SB 2500 / HB 7787, on June 13, 2024. If signed, that law will also become effective on January 1, 2026.

California Privacy Protection Agency Initiatives

The California Privacy Protection Agency is currently considering rules and engaging in pre-formal rulemaking stakeholder sessions regarding the use of automated decision making technology (ADMT). California defines ADMT as technology that collects, uses, retains or discloses personal information and either replaces or substantially facilitates human decision making. Algorithmic “profiling,” discussed above, is encompassed within this definition. Examples include resume-screening tools used by businesses to decide whether to interview applicants and analytics tools that place consumers into audience groups to further target them with advertising.

Businesses subject to the California Consumer Privacy Act (CCPA) that use ADMT for “extensive profiling,” to make “significant decisions” regarding consumers, or to train ADMT with personal information would be subject to new transparency and opt-out requirements. Behavioral advertising, the practice of tracking users’ online activities to deliver ads tailored to their interests, is included within the definition of “extensive profiling.” Further discussion regarding the terms “extensive profiling” and “significant decisions” can be found here. Businesses would be required to offer a pre-use notice informing consumers of how the company uses ADMT and of the individual’s CCPA opt-out rights.

Ongoing Legislative Efforts

Currently, a multitude of states, including New York, California, and Massachusetts, are working on proposed AI governance bills. In addition, new legislation in Illinois addressing AI usage currently awaits the Governor’s signature.

  • California. The Assembly recently advanced multiple bills addressing AI usage. These bills include provisions prohibiting algorithmic discrimination and would establish new compliance and reporting requirements for AI providers. Additionally, these bills would require businesses to implement watermarking systems identifying AI-generated content and to publicize information regarding the methods used to train AI models.
  • Illinois. On May 24, 2024, the Illinois legislature passed HB 3773, amending the Illinois Human Rights Act by adding new provisions regarding the use of predictive data analytics for employment and credit decisions.

Europe

The EU AI Act

On May 21, 2024, the EU Council unanimously passed the EU AI Act (AIA). Businesses, whether EU-based or not, should pay close attention to the upcoming changes for two reasons. First, the AIA applies to all providers of AI systems placed on the EU market, regardless of where the provider is located. Second, the penalties for non-compliance are some of the toughest in the world, allowing for fines of up to €35 million or 7% of a company’s total worldwide annual turnover, whichever is higher.
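
Because the ceiling is the greater of the fixed amount and the turnover-based amount, the effective cap grows with company size. A minimal sketch of that arithmetic (the €35 million and 7% figures come from the AIA as described above; the function name is our own shorthand):

```python
# Illustrative AIA penalty ceiling: the greater of EUR 35 million or 7% of turnover.
def aia_max_fine(annual_turnover_eur: float) -> float:
    """Upper bound on an AIA fine for a company with the given annual turnover."""
    return max(35_000_000.0, 0.07 * annual_turnover_eur)

print(aia_max_fine(1_000_000_000))  # EUR 1B turnover -> 70,000,000.0
```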

Broadly, the AIA creates a risk classification scheme, which places AI systems into one of several categories. The categories are:

  • Unacceptable Risk. AI systems constituting an unacceptable risk are prohibited entirely. These include systems used to manipulate or exploit individuals, systems that classify or evaluate individuals based upon their personal traits, and emotion-recognition systems used in workplace and educational contexts.
  • High Risk. The AIA defines high risk systems as those presenting a significant risk to health, safety, or fundamental rights. Examples of AI systems falling under this category include those used in education, employment, healthcare, and banking settings. Providers of high-risk systems are subject to a number of strict regulations, including required registration in a public EU database. Additionally, providers of these systems must perform regular impact assessments and implement procedures that ensure transparency, security, and human oversight of their systems.
  • Limited Risk. For systems posing limited risks, such as chatbots interacting with humans and AI-generated content, the AIA imposes transparency obligations to ensure humans are informed that an AI system was involved. Providers of AI-generated content must ensure it is identifiable as such.
  • Minimal or No Risk. Minimal-risk AI uses, which present little to no risk to the rights or safety of individuals, can be freely used under the AIA. Examples include AI-enabled video games and spam filters. Most AI systems currently deployed are likely to fall under this category.
  • General Purpose AI (GPAI). GPAI refers to AI systems trained on broad datasets capable of serving a variety of purposes. Popular examples include OpenAI’s ChatGPT and DALL-E programs. Providers of GPAI models are required to produce technical documentation and release detailed summaries of their training data. For GPAI models that present systemic risks, providers must also implement cybersecurity measures, mitigate potential risks, and perform evaluations that include adversarial testing.

We will continue to monitor these ongoing state, federal, and international AI legislative efforts and provide you with the latest updates to help you prepare for what lies ahead.

Summer Associate Joe Cahill contributed to this post.

Arkansas AG Files Suit, Labels Temu a Data-Theft Business
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/arkansas-ag-files-suit-labels-temu-a-data-theft-business
Tue, 02 Jul 2024 11:00:00 -0400

Tim Griffin, Arkansas Attorney General, did not mince words when he filed a lawsuit against the parent companies of Temu, stating in a press release, “Temu is not an online marketplace like Amazon or Walmart. It is a data-theft business that sells goods online as a means to an end.” He further commented that “…Temu is functionally malware and spyware.” The 51-page complaint was filed against WhaleCo. Inc. d/b/a Temu, and its owner, Chinese e-commerce company PDD Holdings Inc. Like General Griffin’s press release, the complaint leaves little to the imagination about the state’s feelings toward the popular online shopping platform, which, according to the complaint, was the most downloaded app in the United States in 2023 and is responsible for tens of millions of shipments into the country each year.

Arkansas brought the action pursuant to the Arkansas Deceptive Trade Practices Act (ADTPA), which prohibits deceptive and unconscionable business practices, and the Arkansas Personal Information Protection Act (PIPA), which requires businesses to protect data concerning Arkansas residents with reasonable security practices. The state’s numerous allegations against Temu regarding app users’ personal information include that Temu is:

  • Using the inducement of low-cost Chinese-made goods to lure users into unknowingly providing near-limitless access to their PII,
  • Misleading users regarding how it uses their data,
  • Not allowing users to avoid being tracked on the internet,
  • Collecting virtually limitless amounts of data, and in addition to Bluetooth and Wi-Fi access, gaining full access to user contacts, calendars, and photo albums, plus all social media accounts, chats, and texts,
  • Gaining permission from user devices upon app installation to subsequently install any further program it wishes without user knowledge or control,
  • Obtaining personal information in a way that is purposely secretive and intentionally designed to avoid detection,
  • Providing or selling user data to unauthorized third parties or using user data in a way that users did not authorize,
  • Potentially harvesting data of non-users who have communicated with users, and
  • Subjecting user data to misappropriation by Chinese authorities.

The state cites reports and third-party research throughout its complaint to support its claims against Temu. Temu, in response, said the company was “surprised and disappointed” by General Griffin filing the lawsuit without what the company called “any independent fact finding.”

In addition to the allegations regarding privacy harms of the app, the state claims defendants make deceptive representations about the quality of the goods sold, which it says are frequently counterfeit, and that users experience undelivered packages and poor customer service. The state further claims Temu uses “false-reference pricing,” in which a retailer represents to a prospective customer that a product is on sale at a steep discount when the “discounted price” is the product’s regular market price. The state’s allegations against Temu also include the use of fake and deceptive reviews, a concern to many AGs, and the claim that Temu failed to take adequate measures to protect minors, among others.

The state requests the court preliminarily and permanently enjoin defendants from treating Arkansas consumers unlawfully, unconscionably, and deceptively as described in the complaint, plus requests civil penalties and other monetary and equitable relief. Temu said, “We categorically deny the allegations and will vigorously defend ourselves.”

Arkansas is not the only state concerned about Temu’s practices. In 2023, Montana banned the download of the Temu app (and a handful of other apps) on devices issued by the state or connected to the state network based on its affiliation with foreign adversaries. The state also banned third-party firms conducting business for or on behalf of the state of Montana from using Temu and the other apps in question.

This action demonstrates the breadth of authority provided to state AGs, especially related to consumer protection. AGs routinely use UDAP laws to bring expansive matters, which can evolve from looking at one issue into many. Here, while much attention has focused on the allegations related to Temu’s data collection and security practices, the general UDAP claims about product quality and fake reviews show how AGs will pursue multiple areas of alleged misconduct in a single action. All of these areas continue to be hot topics for state AGs, as are companies with foreign affiliations viewed as problematic. As we have discussed many times before, AGs often work together in their missions to protect consumers; only time will tell whether more AGs follow with complaints against the popular shopping app.

Farewell to the two-step: Supreme Court overrules Chevron
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/farewell-to-the-two-step-supreme-court-overrules-chevron
Mon, 01 Jul 2024 13:00:00 -0400

In a big week for administrative law watchers, the Supreme Court issued a pair of 6-3 decisions paring back the powers of administrative agencies. In Loper Bright Enterprises v. Raimondo, the Court overruled Chevron U.S.A. v. Natural Resources Defense Council, Inc., and in SEC v. Jarkesy it held that the Seventh Amendment prohibits agencies from seeking civil penalties before administrative tribunals in suits resembling actions at common law. Taken together, these cases demonstrate the Court’s focus on separation of powers. Below, we consider their potential impact on the Federal Trade Commission.

Loper Bright Enterprises v. Raimondo

Loper Bright overrules Chevron, eliminating the deference given to agencies’ interpretations of ambiguous statutes. Since 1984, courts have employed a two-step framework when reviewing an agency’s statutory interpretations. First, courts determined “whether Congress has directly spoken to the precise question at issue.”[1] If it had not and there was any ambiguity, courts moved on to the second step, deferring to the agency’s interpretation if it was a “permissible construction of the statute.”[2] Under Chevron, courts were obliged to cede their independent judgment to an agency’s reasonable interpretation.

In Loper Bright, the Supreme Court held that this deference “defies the command of the [Administrative Procedure Act] that ‘the reviewing court’—not the agency whose action it reviews—is to ‘decide all relevant questions of law’ and ‘interpret . . . statutory provisions.’”[3] Agencies, according to the Court, “have no special competence at resolving statutory ambiguities. Courts do.”[4] Going forward, courts must do what they do best and “use every tool at their disposal to determine the best reading of the statute and resolve the ambiguity.”[5]

For forty years, Chevron gave agencies an advantage in disputes over the statutes they administer, requiring courts to defer to agency interpretations even when an agency’s interpretation evolved (sometimes dramatically) over time. In his concurring opinion, Justice Gorsuch points to the FCC’s changing rules regarding the classification of broadband internet across four administrations as an example of the “regulatory whiplash that Chevron invites.”[6]

The death of Chevron deference will be felt across administrative agencies, including at the Federal Trade Commission (“FTC” or “Commission”). Most immediately, the decision may impact the Commission in legal challenges it faces to two rules: the Non-Competes Clause Rule and the Combating Auto Retail Scams (CARS) Rule. Previously, if there was a dispute about the meaning of the statute at issue, the FTC could simply point to Chevron and expect to prevail. For example, in National Automobile Dealers Association v. Federal Trade Commission, the court applied Chevron to determine that the FTC’s interpretation of “uses” under the Fair and Accurate Credit Transactions Act of 2003 was reasonable and entitled to deference.[7] [Full disclosure: the author worked on this case on behalf of the FTC.] Going forward, a reviewing court will now use its own judgment to resolve any ambiguity.

Beyond rulemaking, Loper Bright could also potentially impact any FTC efforts to expand its authority under Section 5 of the FTC Act. Section 5 is famously broad (and, some might even say, ambiguous). Entities and individuals that find themselves in a dispute with the agency over statutory questions, such as what constitutes an unfair method of competition, may now press their case before a “neutral party ‘to interpret and apply’ the law without fear or favor … .”[8]

SEC v. Jarkesy

In SEC v. Jarkesy, the Court ruled that the Seventh Amendment right to a jury trial requires the SEC to seek civil penalties for securities fraud in district court. The Court found that, where an agency’s claims are legal in nature, they must be tried before a district court. In determining whether claims are legal (and not, for example, equitable or admiralty), courts must consider both the remedy and the cause of action. When a monetary remedy is meant to punish rather than “restore the status quo,” it is legal rather than equitable and implicates the Seventh Amendment.[9] In addition, the Court found that the “public rights” exception (Congress’s ability to assign certain matters to an agency without a jury) did not apply, because fraud is in the nature of an action at common law. Suits akin to those brought at common law “presumptively concern private rights, and adjudication by an Article III court is mandatory.”[10]

While this decision will affect administrative agencies beyond the SEC, its immediate impact on the FTC is likely to be limited. The Commission is already required to go to district court to obtain civil penalties. It is an open question whether other remedies the FTC seeks in administrative litigation are likewise legal in nature and implicate private rights, such that they may no longer be obtained in-house. For example, the FTC has sometimes issued cease and desist orders in deception cases that ban respondents from doing business in a particular industry.[11] If this remedy is viewed as punitive and implicates a core private right, it could only be imposed by an Article III court. Other novel remedies the Commission has obtained in settlements, such as so-called algorithmic disgorgement, may also be off-limits in an agency cease and desist order.

Because the Seventh Amendment jury trial requirement resolved the case, the Court did not reach two other questions that could have affected the FTC. First, the Court did not determine whether an administrative agency’s ability to choose between bringing an action administratively or in federal court violates the non-delegation doctrine. Second, the Court did not consider whether an ALJ’s two layers of for-cause removal protection violate the separation of powers. These and other issues, such as whether the FTC’s role as both prosecutor and judge in administrative proceedings constitutes a due process violation, will continue to be the subject of litigation. If these questions ultimately find their way back to the Court, at least two Justices have signaled that Article III and the Due Process Clause of the Fifth Amendment also limit agencies’ ability to handle certain matters administratively. In a concurrence, Justice Gorsuch, joined by Justice Thomas, writes that, because the SEC would deprive Jarkesy of property, “due process demands ‘nothing less than the process and proceedings of the common law… .’”[12] This means use of an Article III court and “not the use of ad hoc adjudication procedures before the same agency responsible for prosecuting the law, subject only to hands-off judicial review.”[13]

The fight against the so-called administrative state seems destined to continue for a bit longer. We will keep you posted.

Notes

[1] Chevron U.S.A., Inc. v. Nat. Res. Def. Council, Inc., 467 U.S. 837, 842 (1984).

[2] Id. at 843.

[3] Loper Bright Enterprises v. Raimondo, 603 U.S. ___ (2024) (slip op., at 21).

[4] Id. (slip op., at 23).

[5] Id.

[6] Gorsuch, J., concurring at 24.

[7] Nat’l Auto. Dealers Ass’n v. FTC, 864 F. Supp. 2d 65 (D.D.C. 2012).

[8] Id. at 6.

[9] SEC v. Jarkesy, 603 U.S. ___ (2024) (slip op., at 9).

[10] Id. at 14.

[11] See, e.g., In the Matter of Traffic Jam Events, Docket No. 9395 (Oct. 25, 2021), https://www.ftc.gov/legal-library/browse/cases-proceedings/x200041-202-3127-traffic-jam-events-llc-matter (cease and desist order bans Traffic Jam and David J. Jeansonne II from participating in any business relating to the advertising, marketing, promotion, distribution, or sale or leasing of motor vehicles).

[12] Gorsuch, J., concurring at 12. See also, Axon Enterprise, Inc. v. FTC, 598 U.S. 175, 204 (2023) (Thomas, J., concurring).

[13] Id.

What Updates to the Health Breach Notification Rule Mean for Your Business
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/what-updates-to-the-health-breach-notification-rule-mean-for-your-business
Mon, 01 Jul 2024 11:00:00 -0400

On July 29, 2024, the FTC’s revised Health Breach Notification Rule (HBNR) takes effect. The Rule requires vendors of personal health records (PHRs) and related entities not covered by HIPAA to notify individuals, the FTC, and in some cases, the media in the event of a breach of unsecured personal health data. Businesses operating a wide array of services, including health, diet, and fitness apps, should take care to review the revised HBNR and assess its applicability to their practices.

Background

Since the original Rule was issued in 2009, the use of health-related apps and other direct-to-consumer technologies collecting health information (e.g., fitness trackers and wearable blood pressure monitors) has proliferated. In September 2021, the FTC issued a “Policy Statement” reinterpreting the scope of the HBNR and signaling the FTC’s intent to treat these new products and technologies as covered by the Rule. The updated HBNR, finalized on April 24, 2024, formalizes this expansion of the HBNR as envisioned in the Policy Statement.

Who Is Subject to the HBNR?

The HBNR applies to entities offering or maintaining personal health records not covered by HIPAA. It covers vendors of PHRs, PHR-related entities, and third parties providing services to these entities. However, distinguishing among these categories can be challenging. Here are a few examples:

  • Vendors of PHR. Companies providing online platforms or mobile apps that allow consumers to create comprehensive health records by storing and managing their health information from multiple sources are likely vendors of PHR. Examples include fitness tracking apps, diet and nutrition apps, and mental health apps that integrate data from the user and other sources, such as wearable devices or purchase histories stored with retailers.
  • PHR-Related Entities. Businesses offering devices like remote blood pressure cuffs or internet-connected glucose monitors may qualify as PHR-related entities when users sync this health information with another health app.
  • Third Party Service Providers. Third party service providers are roughly equivalent to processors or service providers. Businesses offering data security, advertising, or analytics services, for example, to a PHR vendor or a PHR-related entity with access to unsecured PHR data are considered third party service providers under the HBNR.

Businesses should be aware that where they fall under these categories depends heavily on the specific practices at hand, and they may move from one category to another. For instance, a third party may be considered a PHR-related entity if it offers services to a health app for its own purposes, such as research and development or product improvement. Similarly, a device manufacturer may be a PHR-related entity if it syncs health information with a third-party health app, but a vendor of PHR if it syncs health information with its own app (while integrating data from multiple sources).

What’s New

  • Covered Health Care Providers. “Covered health care providers” constitute one category of the sources of PHR individually identifiable information (other categories of sources are employers and HIPAA-covered entities). The Rule expansively defines covered health care providers to include “any online service such as a website, mobile application, or internet-connected device” that supports the tracking of consumer health indicators, like fitness, sleep, mental health, and vital signs. Under the revised Rule, mobile apps are now considered covered PHR-related entities when they integrate with other devices or services, such as geolocation functions, calendars, or third party data, linking them to the user’s PHR. Crucially, the Rule only applies to services with the capacity to draw information from multiple sources. For example, businesses that integrate fitness data into third-party sleep apps could now be considered PHR-related entities under the Rule.
  • “Breaches” Subject to the Rule. The revised Rule expands the scope of a “breach of security” to include any disclosures, including sharing or selling, of unsecured PHR that are not authorized by the individual. Covered providers should be mindful that this updated definition goes well beyond a traditional cybersecurity incident. For example, businesses that collect PHR for a legitimate purpose and subsequently share or use that information in a way not expressly authorized by the individual may have committed a “breach” under the new definition.
  • Timing of Notice to the FTC. Businesses covered by the HBNR that experience a breach involving the unsecured data of more than 500 individuals now have sixty days to notify individuals and the FTC of the incident. This change will be helpful to businesses, particularly larger entities, as the previous Rule required notifications to be sent within ten days, regardless of the size of the data breach.
  • Expanded Use of Electronic Notice. PHR vendors and related entities that discover a breach of security must provide written notice to individuals. Updates to the Rule now allow for the use of email and other electronic means to notify consumers of a breach, including text messages, in-app messages, and website banner messages. Additionally, breach notices must now include a brief description of the measures businesses are taking to protect affected individuals.

Why It’s Important

App developers and other companies providing health, wellness, fitness, and related apps should consider these updates to the HBNR, assess its applicability to their business, and comprehensively review their notification obligations in the event of an unauthorized disclosure. Firms offering data security, cloud computing, advertising, or analytics services to health apps should also review their potential obligations as third party service providers.

Summer Associate Joe Cahill contributed to this post.

Telemarketing in 2024 – A Mid-Year Review
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/telemarketing-in-2024-a-mid-year-review
Thu, 27 Jun 2024 11:00:00 -0400

As we approach the 2024 halfway mark, businesses that rely on texting and calling to promote their products and services face an onslaught of new and significant legal and regulatory developments. To help with tracking these developments in one place, below we summarize key telemarketing law developments and the corresponding timelines to keep in mind:

  • 1:1 Consent – At the end of 2023, the FCC adopted an amendment to the definition of “prior express written consent” under its TCPA rules to require that a consumer give specific consent to be contacted by a particular seller for marketing purposes, and that such consent must be “logically and topically related” to the context in which it was obtained. This rule will officially go into effect on January 27, 2025, but we have seen a trend among service providers in the industry (particularly calling and texting platforms) requiring that their customers implement 1:1 consent well ahead of that deadline (and make corresponding changes to their privacy policy about sharing consent data with third parties). It would be prudent for affected businesses to take this time to carefully review their opt-in processes and privacy policies to assess what changes are necessary from both a commercial and legal compliance perspective.
  • AI and the TCPA – On February 8, 2024, the FCC voted unanimously in favor of a Declaratory Ruling that classifies AI-generated voices on robocalls as “an artificial or pre-recorded voice” under the TCPA. This means that calls using AI technology to generate a simulated or pre-recorded human voice must satisfy the TCPA’s consent requirements (including prior express written consent for marketing calls using AI). While the FCC focused the ruling on the common use and accessibility of AI-generated voices by bad actors to perpetrate fraud and spread misinformation, the development underscores the heightened regulatory scrutiny on a business’s use of AI to mimic human behavior for marketing purposes. The FTC also outlined in a recent blog post some of the potential consumer protection and privacy concerns that can arise from the use of AI chatbots to interact with consumers.
  • Expanded Opt-Out Rules – On February 15, 2024, the FCC adopted a Report and Order and Further Notice of Proposed Rulemaking to amend its TCPA rules and clarify the ways in which consumers can revoke consent to receive calls and texts. Among the changes was the adoption of various “per se” reasonable methods for revoking consent, including by texting the words “stop,” “quit,” “end,” “revoke,” “opt out,” “cancel,” or “unsubscribe” (see the sketch after this list). The FCC also made clear that businesses cannot prescribe a particular method for revoking consent and must honor reasonable opt-out requests within 10 business days. Importantly, while businesses are permitted to send a one-time text to clarify the scope of a consumer’s opt-out request if that consumer has previously consented to receive multiple types of messages, if the consumer does not respond to that message, they are presumed to have revoked consent for all further non-emergency communications. The effective date for the amended revocation of consent rule is still uncertain, as it is undergoing review by the Office of Management and Budget. Once that review is complete, the FCC will issue a notice, and the rule will be effective six months thereafter. Businesses can prepare for this change by evaluating and testing their technology and processes to confirm they can honor opt-outs in accordance with the new requirements.
  • Telemarketing Sales Rule Changes – Looking beyond regulatory changes at the FCC, the FTC announced in March a significant update to the Telemarketing Sales Rule, most notably expanding parts of the rule to business-to-business calls and expanding the scope and timeline of recordkeeping obligations for telemarketers. These amendments generally became effective on May 16, 2024, except for the “call detail” records subsection, for which the FTC had previously announced a 180-day grace period to give affected businesses time to implement the systems, software, or procedures necessary to comply. As such, businesses will have until October 15, 2024 to adhere to that particular provision of the rule.
  • New and Updated State Telemarketing Laws. A number of recently-enacted state laws related to telemarketing have taken effect (or will take effect) in 2024, including:
  • Maryland – The “Stop the Spam Calls Act of 2023” became effective on January 1, 2024. Key provisions of the new law include a requirement for “prior express written consent” for telephone solicitations that involve “an automated system for the selection or dialing of telephone numbers,” as well as call time and frequency restrictions similar to those adopted in other states, and a private right of action for alleged violations. To date, we are not aware of any private litigant bringing forward an action in court under the new law.
  • Maine – Earlier this year, Maine adopted a first-of-its-kind amendment to its telephone solicitation law that requires solicitors to scrub against the FCC’s reassigned number database prior to initiating a call. While limited in scope due to underlying exemptions in the statute, the requirement will become effective on July 16, 2024.
  • Georgia – Several changes to an existing telemarketing law in Georgia were recently enacted, including: (1) eliminating the requirement for a “knowing” violation of the law to pursue enforcement; (2) extending liability for calls made “on behalf of any person or entity” in violation of the law; (3) allowing private plaintiffs to pursue claims for violations as part of a class action with no limitation on damages; and (4) creating a safe harbor defense for solicitations made to a consumer “whose telephone number was provided in error by another subscriber” if the caller “did not know, or have reason to know, that the telephone number was provided in error.” These amendments will become effective on July 1, 2024.
  • Mississippi – By a series of amendments to its existing telephone solicitation law, Mississippi severely restricted the ability of businesses to contact consumers by phone about Medicare Advantage plans (unless a consumer first initiates a call to the business about such plans), and effectively banned telemarketing for Medicare supplement plans. These restrictions are unique among state telemarketing regulations because they are narrowly focused on calls about certain Medicare plans, and may be challenged on First Amendment grounds. In the interim, however, the restrictions will take effect on July 1, 2024.
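
To make the revocation mechanics concrete, below is a minimal sketch of how a texting platform might triage inbound messages under the amended FCC rules referenced above. The keyword set mirrors the “per se” reasonable methods; the function names and routing logic are our own illustrative assumptions, not a compliance standard, and a real system would also need to catch reasonable revocations phrased in other ways.

```python
# Minimal sketch of inbound SMS triage under the FCC's amended revocation rules.
# The keyword set mirrors the "per se" reasonable opt-out methods; names and
# structure are illustrative assumptions, not a compliance standard.
PER_SE_OPT_OUT_KEYWORDS = {
    "stop", "quit", "end", "revoke", "opt out", "cancel", "unsubscribe",
}

def is_per_se_opt_out(message: str) -> bool:
    """True if the message exactly matches a per se opt-out keyword."""
    return message.strip().lower() in PER_SE_OPT_OUT_KEYWORDS

def handle_inbound_text(message: str) -> str:
    if is_per_se_opt_out(message):
        # Revocations must be honored within 10 business days; other
        # reasonable, non-keyword revocations must be evaluated as well.
        return "record_revocation"
    return "route_for_review"

assert handle_inbound_text("STOP") == "record_revocation"
```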

If you have any questions about how these developments may affect your business, please contact Alysa Hutnik or Jenny Wainwright. For more telemarketing updates, subscribe to our blog.

Attorney General Alliance Meeting Recap: Focus on Director Chopra’s Remarks
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/attorney-general-alliance-meeting-recap-focus-on-director-chopras-remarks
Tue, 02 Apr 2024 12:00:00 -0400

Last week, state attorneys general (AGs) gathered to discuss Nevada Attorney General and Attorney General Alliance Chair Aaron Ford’s Initiative, focusing on consumer protection education. Attendees heard from panels discussing topics ranging from consumer financial literacy, digital literacy, and cybersecurity to the continued hot topic of AI. Below we highlight the fireside chat between AG Ford and CFPB Director Rohit Chopra, who discussed a variety of important topics and collaboration with State AGs.

Cooperation and Role of State AGs

Director Chopra began by complimenting the State AGs for their important consumer protection work, including by ringing the alarm bells on the foreclosure crisis. While he discussed the roles of states throughout his remarks, he emphasized that states and the CFPB need to work together. He reminded the AGs that early on, the CFPB published a procedural rule clarifying that State AGs can bring suit under the CFPB’s organic statute including a whole host of consumer credit and data laws, which several states have utilized. Director Chopra said the CFPB is looking for ways to collaborate with more states, noting they have been able to collect billions in penalties that can be used for consumer redress even in unrelated cases to make victims whole. Director Chopra asked that consumers send complaints to both State AG offices and the CFPB, because their consumer complaint system immediately routes complaints to the financial company at issue to get a response and resolution without staff intervention. The CFPB also is able to use data visualization tools to provide states with information about hot issues in different regions. The CFPB has used complaint data in collaboration with states to work on medical debt and other collection cases. Director Chopra said the CFPB is always looking upstream to identify warning signs to avoid future crises like the one caused by subprime mortgages.

Consumer Data and Security

Director Chopra explained that President Biden’s executive order on protecting sensitive personal data highlights a broad bipartisan interest in stopping the bulk transfer of consumer data. He explained that State AGs can work alongside the CFPB to enforce the Fair Credit Reporting Act, not only against the big three reporting companies, but also against data brokers. He noted the CFPB will propose additional rules to require data brokers to adhere to accuracy standards and otherwise protect consumer data. Director Chopra described categories of data and lists that brokers can purchase about vulnerable consumers, and his concern that there be a way for these people to be able to participate in the digital world without sacrificing privacy and security. He pointed to state laws requiring additional privacy and security such as Washington’s recent My Health My Data Act, and said he supported the fight against preemption of state privacy laws.

Big Tech and AI

Director Chopra reminded the audience of big tech’s efforts to become payment processors, providing them consumer transaction data. He noted these payment methods were used as a vector for imposter fraud, specifically citing the DOJ and states’ March lawsuit against Apple. Director Chopra explained the CFPB has recruited additional technologists with knowledge of user interfaces and design, and the agency has hosted enforcer roundtables with states to discuss issues with AI and technology, including how to draft CIDs.

On AI, Director Chopra said the CFPB is looking at marketing and advertising for discriminatory or manipulative AI. The agency is also reviewing how loans are being written, because if AI cannot explain why it denied credit to a person, that is a violation of the federal law that requires an explanation for a denial. Director Chopra also said chat bots are another form of AI used by banks for customer service. He suggested it could be considered deceptive for chat bots to use human names and “…” typing indicators to simulate human activity. Director Chopra said he wants to see institutions affirmatively describe these chat bots as robots and ensure the bots do not provide inaccurate information or a poor customer service experience.

Bank Relationships

Director Chopra said the CFPB’s work has shifted some from mortgage lending issues with banks into non-banks. He said they have also heard from the AG community that national chartered banks have not cooperated on investigations, claiming preemption. Director Chopra said when that happens, the CFPB will work with the states to obtain the information themselves. His expectation is that banks work with the states to ensure consumers are protected.

Junk Fees

As a former businessperson himself, Director Chopra said pricing consultants he encountered in the past left a big impression on him. He noted that industries such as air travel, event ticketing, and banking have made it difficult to compare pricing resulting in reduced competition. He described certain bank fees such as a paper statement fee (when nothing is printed and no paper is sent) as “fake fees,” and harkened back to past CFPB actions against banks reordering payments to trigger multiple overdraft fees. Director Chopra also said that some credit card issuers created a business model based on rooting for people to be late, causing late fees. The CFPB has proposed rules to close what he described as a loophole in the credit industry, stating people understand they will have to pay interest but do not understand the other layers of fees they may not be able to control. Director Chopra also pointed to potential concerns with credit card reward “bait and switch” offers as a core truth in advertising concept. Though the CFPB is using rulemaking and enforcement actions to combat junk fees, Director Chopra also gave credit to the business community for taking initiative to become more upfront and transparent.

Stay tuned as our team will be hearing more from the State AG community on AI and other tech topics in less than two weeks at the NAAG and AGA’s Southern Regional Meeting/Artificial Intelligence and Preventing Child Exploitation Seminar.

Health Data Privacy: What We’re Hearing
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/health-data-privacy-what-were-hearing
Tue, 12 Mar 2024 09:00:00 -0400

U.S. privacy developments are moving quickly, but health data privacy is racing forward. Companies that come into contact with consumers’ health data need to track and respond to a variety of developments. Most notably, these include Washington’s My Health My Data (MHMD) Act, a similar law in Nevada, “sensitive data” and “sensitive personal information” requirements under comprehensive state privacy laws, and FTC enforcement actions and guidance asserting that a broad range of health data is sensitive. How a company responds to these developments is likely to be iterative, given the lack of clarity and harmonization among these requirements and the substantial resources required to implement changes.

In terms of what is visible, regulators expect companies to make detailed, specific disclosures and obtain opt-in consent for most health data collection, use, and sharing. Getting these disclosures and consents right, however, requires a lot of preparatory work, starting with identifying health data that a company controls.

A few steps can be helpful in managing this uncertainty. First, adopting a framework to classify health data will lead to greater consistency and efficiency in this cornerstone compliance activity. Second, taking a clear-eyed view of the difficulty of obtaining consent will help set realistic business expectations for the use of health data in this challenging regulatory environment. Third, documenting a health data privacy program will help to maintain the program over time and demonstrate compliance to regulators and commercial partners.

Framework for Data Classification: Is It Health Data?

For many companies, determining whether they process health data, and which elements of their data inventories constitute health data, is a daunting task. The exercise can involve reviewing thousands of variables, segments, or personal data elements.

At present, there is little regulatory guidance and no common taxonomy of health data definitions, making it difficult to benchmark. In addition, health data definitions vary across state and federal regulators, and adopting a national approach based on the broadest definition might be infeasible from a business perspective.

A few strategies can help manage the uncertainty:

  • Work from explainable factors. Using factors that capture the overlap among different health data definitions will help establish consistent classifications and educate business stakeholders about when they’re encountering health data (see the sketch after this list). Factors based on the current range of health data definitions include:
    • Does the data reveal a specific health condition?
    • Does the data reveal a past or present health condition?
    • Does the data relate to a specific consumer?
    • Does the data relate to a sensitive health matter?
    • What kinds of harm (if any) could use or disclosure of the data reasonably cause to an individual?
    • Particularly important for Washington and Nevada: Does the data relate to a consumer’s past, present, or future health status?
  • Think holistically about classification. Classifying health data in a vacuum can lead to trouble. Regulatory definitions are broad, and it may be insufficient to analyze a data element on its own. Rather, it may be necessary to consider how the purpose of using or disclosing a specific data element bears on whether it is health data. It’s also possible that a data element is “health data” when under the control of one entity but not another. Understanding the business processes, contractual commitments, data sources, and other factors surrounding potential health data is therefore critical. In many cases, there won’t be a clear answer to whether a data element is health data. Being able to identify what’s clearly in, and out, of this category allows businesses to devote more time to genuinely debatable cases.
  • Think about scalability and sustainability. A one-time classification effort, even if it encompasses thousands of variables, might be feasible for many companies. Maintaining these classifications over time is another story. For companies with relatively static data inventories, maintenance over time is likely less challenging. When inventories change quickly, however, a case-by-case review of data elements might be impractical. Consider setting a cadence for review and how one might designate privacy champions within the business to apply the framework on an ongoing basis, in coordination with legal support.
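
As a thought experiment, the factor-based review described in the first bullet above can be captured in a simple checklist structure, sketched below. The field names, factors, and triage outcomes are our own illustrative assumptions drawn from the questions listed above; actual classification is context-dependent and warrants legal review.

```python
# Illustrative checklist for triaging whether a data element may be "health
# data." Field names and triage logic are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class HealthDataReview:
    element: str
    reveals_specific_condition: bool
    reveals_past_or_present_condition: bool
    relates_to_sensitive_matter: bool
    relates_to_health_status: bool  # key factor for Washington (MHMD) and Nevada
    relates_to_specific_consumer: bool

    def triage(self) -> str:
        if not self.relates_to_specific_consumer:
            return "likely out of scope - document the rationale"
        if any([
            self.reveals_specific_condition,
            self.reveals_past_or_present_condition,
            self.relates_to_sensitive_matter,
            self.relates_to_health_status,
        ]):
            return "treat as health data pending legal review"
        return "likely not health data - document the rationale"

review = HealthDataReview("step_count", False, False, False, True, True)
print(review.triage())  # -> treat as health data pending legal review
```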

There’s Consent, and Then There’s MHMD Consent

While the FTC and states with comprehensive privacy laws are moving toward requiring opt-in consent for most health data processing, MHMD creates particularly stringent consent requirements. The difference between MHMD and other health data regulations lies not in the action required for consent – it must be voluntary and unambiguous – but in the narrow scope of consent that is permissible and the details that must be disclosed to make the consent informed. (Although other regulators have not been as explicitly restrictive, there is a clear trend in this direction, as we discussed in our recent posts on the FCC’s one-to-one consent order.)

Specifically, a business must disclose the following to obtain consent to collect or share consumer health data (captured as required fields in the sketch after this list):

  1. The categories of consumer health data collected or shared;
  2. The purpose of the collection, including the specific ways in which it will be used;
  3. The categories of entities with whom the consumer health data is shared; and
  4. How the consumer can withdraw consent from future collection or sharing of consumer health data.
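
One way to operationalize these four elements is to treat them as required fields in whatever system assembles a consent request, so that a request cannot ship with any element missing. A minimal sketch follows; the names and structure are our own assumptions, not statutory text.

```python
# Minimal sketch: MHMD's four required consent disclosures as mandatory fields.
# Field names and validation are illustrative assumptions, not statutory text.
from dataclasses import dataclass

@dataclass
class MHMDConsentDisclosure:
    health_data_categories: list[str]  # 1. categories collected or shared
    collection_purposes: list[str]     # 2. purposes, incl. specific uses
    recipient_categories: list[str]    # 3. categories of entities receiving data
    withdrawal_instructions: str       # 4. how to withdraw consent

    def is_complete(self) -> bool:
        """True only if every required disclosure element is populated."""
        return all([
            self.health_data_categories,
            self.collection_purposes,
            self.recipient_categories,
            self.withdrawal_instructions.strip(),
        ])
```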

Meeting these standards might be infeasible for many businesses, particularly those that do not have direct relationships with consumers.

MHMD’s requirements to sell consumer health data are even more stringent. The law requires a valid authorization, which must include the name and contact information of the purchaser, be signed by the consumer, and expire within one year of signature, among other requirements. Obtaining an authorization outside of limited circumstances is unlikely to be practical for most companies.

The main alternative to consent or authorization is to restrict collection of health data under MHMD to what fits under the necessity exception. Washington has not provided further guidance on the scope of this exception, but we expect regulators to interpret this exception narrowly.

Documentation is Key

We get it: companies are reluctant to create discoverable documents that might be used to prove that they misinterpreted health data regulations. The alternative, however, is far worse and could be used to support the argument that a company systemically failed to govern health data in a reasonable fashion. It can also lead to inconsistent practices within a company and time-consuming back-and-forth between business and legal teams.

Key documents include health data definitions, consent requirements, partner diligence processes, data subject request procedures, and model contract terms. Many of the consumer health data practices that should be documented are likely extensions of current privacy programs and processes, such as data protection assessments.

Of course, some discussions warrant protection under attorney-client privilege. Maintaining clear lines between discussions that provide legal advice and operational guidance to business teams will help draw defensible lines around privilege.

* * *

The acceleration in health data privacy regulation is adding to the demands on privacy teams that are already stretched thin. Confronting the breadth of “health data” definitions and the impact of these regulations on business operations in the absence of regulatory guidance is especially challenging. For better or worse, the boundaries of health data privacy regulations will be clarified through enforcement and MHMD’s private right of action. In the meantime, understanding these laws’ core purposes and keeping a close watch on statements from regulators will be helpful starting points for setting compliance priorities.

New Jersey and New Hampshire Set the Pace with 2024 State Privacy Legislation
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-jersey-and-new-hampshire-set-the-pace-with-2024-state-privacy-legislation
Thu, 08 Feb 2024 09:30:00 -0500

New Jersey and New Hampshire are the first states out of the gate in what promises to be another busy year in state privacy legislation.

On January 16, New Jersey Governor Phil Murphy signed the New Jersey Data Privacy Act (NJDPA), making the Garden State the first to enact a comprehensive privacy law in 2024. New Hampshire is set to follow after Senate Bill 255 (SB255) passed the legislature on January 18 and is awaiting the governor’s signature. Both pieces of legislation closely resemble frameworks established by other U.S. states, such as Colorado.

What is Notable about NJDPA and SB255?

The NJDPA and SB255 are largely consistent with other key state comprehensive privacy laws. However, there are some noteworthy differences in the two laws:

New Jersey

NJDPA does not exempt nonprofit organizations and lacks specific revenue thresholds seen in most other comprehensive privacy laws (the main exception is Colorado). New Jersey is also the first state to require consumer consent before processing financial information. Financial information is defined to include “a consumer’s account number, account log-in, financial account or credit or debit card number, in combination with any required security code, access code or password that would permit access to a consumer’s financial account.”

Further, New Jersey joins only two other states (California and Colorado) in authorizing rulemaking to implement its privacy law, including regulations about opt-out signals. The law does not set a deadline to adopt regulations, but the degree of harmonization with other state regulations will bear close watching.

New Hampshire

New Hampshire’s pending bill contains a low applicability threshold. As shown in the table linked below, SB255 will apply to entities that control or process the personal data of only 35,000 New Hampshire residents, or 10,000 New Hampshire residents if the entity derives more than 25% of its gross revenue from the sale of personal data. New Hampshire’s pending privacy bill is thus significant for its potential reach, extending to businesses that process a relatively small volume of state residents’ personal data.

Click here to see how the NJDPA and SB255 stack up to other key state comprehensive privacy laws.

Looking Ahead

Although the New Jersey and New Hampshire laws share many similarities with other comprehensive privacy laws, both add a layer of complexity to the greater state privacy landscape. As state privacy laws proliferate, keeping an eye on differences among them will help keep privacy compliance programs up to date.

State AGs Focus on Social Media and its Impact on Youth https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/state-ags-focus-on-social-media-and-its-impact-on-youth https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/state-ags-focus-on-social-media-and-its-impact-on-youth Wed, 07 Feb 2024 08:00:00 -0500 As we have discussed, the NAAG President, Oregon Attorney General Ellen Rosenblum, formally announced her 2024 Presidential Initiative focusing on America’s youth. As we noted, this was consistent with a broader priority for 2024 among many state attorneys general (AGs) across the country.

Last week we saw a flurry of activity from State AGs relating to children’s privacy and safety in advance of the January 31, 2024, Senate Judiciary Committee hearing on Big Tech and the Online Child Sexual Exploitation Crisis. Related to the hearing, Florida Attorney General Ashley Moody called on Congress to push child-safety initiatives and hold Big Tech companies that target our kids “accountable for the problems they have created.”

California Attorney General Rob Bonta and members of the California legislature announced two bills: the Protecting Youth from Social Media Addiction Act (SB 976), and the California Children’s Data Privacy Act (AB 1949). These bills were described as “seeking to protect youth online” and touted as part of Attorney General Bonta’s ongoing efforts.

SB 976 describes the legislative intent as ensuring that platforms obtain parental consent before exposing children to “addictive” features. Among other things, it:

  • Bans social media operators from providing a so-called “addictive feed” unless, among other exemptions, privacy settings are available to detect minors, the user selected the information themselves, or the feed involves private communication between users
  • Requires verifiable parental consent for minors under 18 to use such a feed
  • Restricts notifications by default, absent parental consent, during what the press release describes as nighttime and “school hours,” among other default notification and access restrictions
  • Requires operators to provide settings to limit the ability to view numbers of likes or other user feedback and set child accounts to private by default
  • Requires operators that continue to provide services to minors to maintain the same standards, without altering the price, for users taking advantage of the bill’s rights
  • Requires ongoing reporting regarding statistics on minor users and the number of parental consents
  • Provides the AG the ability to adopt further regulations, especially pertaining to age verification and parental consent

AB 1949 proposes to amend the California Consumer Privacy Act to ban businesses from collecting, selling, or sharing the personal information of users under 18 without informed consent, whether from the teen directly or from a parent for younger children. The amendment would also require the California Privacy Protection Agency to adopt rules relating to child-specific opt-outs and age verification.

Last week, we also saw Nevada announce another AG-led action in its suit against five social media platforms alleging they were designed to addict children. The causes of action include violations of the state’s UDAP law, as well as negligence, unjust enrichment, and product liability claims. Nevada followed other states in bringing only state court claims in an individual state case, in contrast to the multistate coalition of 33 states that filed suit together in October 2023 in federal court asserting violations of COPPA against one of the platforms, in addition to violations of state law.

While the impact of these recent bills and actions may seem limited to a few big players, companies should take note that State AGs are willing to take aggressive action against any businesses they believe are taking advantage of youth. If children or teens are part of your key demographic, you should look closely at how your data practices, advertising, or services impact your audience. The AGs will be looking closely, too.

Top 5 State Privacy Issues We’re Monitoring This Year https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/top-5-state-privacy-issues-were-monitoring-this-year https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/top-5-state-privacy-issues-were-monitoring-this-year Tue, 09 Jan 2024 12:00:00 -0500 The year ahead promises to be busy on the state privacy front. As we’ve covered on this blog, states are continuing to fill the gap at the federal level by implementing comprehensive state laws that guarantee consumer privacy rights and regulate data sales, targeted advertising, and sensitive data.

Now, more states than ever are jumping on the bandwagon with comprehensive privacy laws on the books in more than 25 percent of U.S. states and new legislative efforts underway in many other states. In 2024, new laws in Florida, Tennessee, Texas, and Oregon will take effect, joining laws already in effect in California, Colorado, Connecticut, Utah, and Virginia. Laws focused on consumer health data take effect in Washington and Nevada in March as well.

As we look ahead to 2024 activity, we’re particularly monitoring for developments in the following five areas as the year unfolds:

1. ENFORCEMENT

  • For the most part, state privacy law enforcement has taken place outside the public view in the form of informal inquiries, consumer complaints, and enforcement sweeps. Public enforcement actions in 2024 will no doubt garner attention and demonstrate the types of issues that enforcement agencies are initially prioritizing under the new privacy laws. We will be watching closely for public enforcement actions stemming from the slate of new state privacy laws that have taken effect over the last year.
  • Since the California Consumer Privacy Act (CCPA) took effect on January 1, 2020, the only public state privacy enforcement action with monetary penalties was the California Attorney General’s August 24, 2022 action against Sephora, which settled for $1.2 million. In the matter, the California AG clarified its view that Sephora’s use of ad tech services involved a “sale” of personal information to these third parties. The case also placed a spotlight on the AG’s requirement that companies recognize browser-level opt-out requests via opt-out preference signals.
  • The California Privacy Protection Agency (CPPA), for its part, is investigating connected vehicles and related technologies but has not announced a public enforcement action. However, a court order delayed enforcement of the agency’s regulations until March 29, 2024, limiting the scope of the agency’s potential enforcement activity until then.
  • What to know: Where a state regulator announces priorities or enforcement, it’s worth paying attention to those developments and comparing how your company’s compliance matches up. If there are areas for improvement on such priority topics, it would be prudent to address them sooner rather than later.

2. TECHNOLOGY

  • We expect regulators, tech companies, and industry will continue to evaluate, adopt, and offer new technologies that facilitate compliance with privacy rights requests and privacy requirements in the coming year. These nascent technologies tie legal requirements to implementation (or monitoring for compliance) in a concrete way and are increasingly becoming the face of privacy laws.
  • Seven of the new state privacy laws require businesses to recognize and respond to a universal opt-out preference signal typically sent from the consumer’s browser. This signal allows a consumer to decide to opt out of the sale of their personal information for all websites, as opposed to making opt-out selections on a site-by-site basis (a minimal detection sketch appears after this list).
  • Another area to watch: the CPPA’s new authority to develop a single mechanism that allows consumers to delete personal information held by data brokers registered with the state. Under SB 362, which passed the California legislature last year, the CPPA will be required to implement the new deletion mechanism by January 1, 2026, and in the intervening two years we expect to see the CPPA solicit feedback on, develop, and test this new deletion system.
  • What to know: The new privacy legal landscape requires a lot of resources, manpower, technical support, education, and time. As the year proceeds, and the baseline of knowledge and expectations increases, regulators are likely to be less sympathetic to arguments that compliance is difficult. If you have not already done so, it may be time to implement a privacy risk matrix to make the case for why you need more internal resources to support your company’s compliance efforts (which has the added benefit of often supporting a durable data strategy).
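
For readers who want the mechanics, the most prominent of these signals is the Global Privacy Control (GPC): participating browsers and extensions attach a “Sec-GPC: 1” header to web requests and expose a navigator.globalPrivacyControl property to page scripts. Below is a minimal detection sketch in TypeScript; the optOutOfSale function and the Express middleware are hypothetical stand-ins for a site’s own opt-out plumbing, not a definitive implementation.

import express from "express";

// Minimal sketch of honoring the Global Privacy Control (GPC) signal.
// optOutOfSale() is a hypothetical stand-in for a site's own opt-out logic.
function optOutOfSale(): void {
  // e.g., suppress third-party ad-tech tags and record the opt-out
  console.log("Treating visitor as opted out of sale/sharing");
}

// Client side: check the browser property before loading ad-tech tags.
const nav = (globalThis as { navigator?: { globalPrivacyControl?: boolean } }).navigator;
if (nav?.globalPrivacyControl === true) {
  optOutOfSale();
}

// Server side (Express-style middleware): check the header equivalent.
const app = express();
app.use((req, res, next) => {
  if (req.header("Sec-GPC") === "1") {
    res.locals.gpcOptOut = true; // downstream handlers treat visitor as opted out
  }
  next();
});

Several of the state laws treat a recognized opt-out preference signal as a valid opt-out request in itself, so the check belongs wherever a sale or share decision is actually made.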

3. CONSUMER HEALTH DATA

  • On March 31, 2024, the majority of the substantive provisions of the Washington My Health My Data Act take effect. As we’ve written about on the blog, the new law is much broader than it may seem, and has significant implications for companies that collect or process the broadly defined category of “consumer health data.”
  • In particular, the Washington law includes a path for private plaintiffs to seek damages under the Washington Consumer Protection Act for practices “covered” by the My Health My Data Act. The legislature included a declaration that the law addresses “matters vitally affecting the public interest” and that violations of the law constitute an “unfair or deceptive act in trade or commerce,” which are required elements under the Washington Consumer Protection Act. A plaintiff would still be required to establish injury caused by a violation of the My Health My Data Act to bring a claim. This new enforcement mechanism, along with state AG enforcement, could lead to a proliferation of privacy actions in Washington in the coming year.
  • The law also reflects an increasing focus on the protection of sensitive data in state law. Whereas earlier iterations of state privacy laws traditionally focused on protecting social security numbers or financial account numbers, the new state privacy laws expand the definition of “sensitive” data to include racial/ethnic origin, citizenship/immigration status, religious/philosophical beliefs, sex life, sexual orientation, biometric and genetic information, precise geolocation, and health information. This expanded set of sensitive elements is already requiring companies to reflect on the types of information that they collect and what information they share with other companies, and to determine whether they need to modify business practices, including by collecting consent where necessary.
  • What to know: If you have not already done so, it would be well worth the time to classify what data your company processes that meets the definition of health data or sensitive data under these laws and to map whether your data processing activities are in line with the new restrictions and requirements. Unfortunately, there is no industry consensus on a taxonomy for such data, but given how much rides on correct classifications, it would be prudent to be proactive here.

4. ACCOUNTABILITY

  • Enforcement agencies are increasingly focused on promoting verifiable accountability for privacy practices, and we expect to see this trend continue throughout 2024.
  • The most prominent example is the due diligence requirement in the new CCPA regulations that takes effect in March. The regulations incentivize businesses to conduct due diligence on any service providers, contractors, or third parties to which the business transfers or sells personal information to ensure these partners comply with CCPA requirements, stating that a business that “never enforces the terms of the contract” nor “exercises its rights to audit” might forgo a defense that the business believed its partner complied with the CCPA. It’s a good reminder to be prepared to demonstrate compliance, including by documenting due diligence and compliance check-ins during the life of the relationship.
  • The CPPA is expected to conduct rulemaking on additional accountability measures: cybersecurity audit regulations, risk assessment regulations, and automated decision-making technology regulations. These draft regulations will create new assessments where businesses will be required to evaluate their privacy or security-related practices, document those practices, and in some cases provide transparency about such practices in the form of consumer-facing notices or regulatory filings. These details will be worked out over the course of 2024, with final regulations to take effect potentially as early as the first half of 2025.
  • The Colorado, Connecticut, and Virginia laws already in effect require data protection assessments when engaging in certain data sales, targeted advertising, processing of sensitive data, or profiling activities. The AGs in these states may request these assessments as part of enforcement activities, and we’ll continue to watch for any resulting public enforcement actions related to data protection assessments. Companies should consider how they’re maintaining attorney-client privilege and protecting attorney work product versus what are non-privileged portions of the assessments that they’ll need to produce in response to a regulator request (or which could become targets in litigation discovery requests).
  • California’s new data broker law, SB 362, will also increase data broker accountability this year through mandatory disclosures that must be published by July 1, 2024. Data brokers will be required to compile and publish metrics about privacy requests received during the previous calendar year. Data brokers should also re-register with the CPPA’s new data broker registry by January 31.
  • What to know: As accountability obligations mount, companies can build out internal resources, knowledge, and infrastructure that will be needed to address new compliance documentation requirements. For example, consider incorporating due diligence and data protection assessment triggers into your new contract intake process to help incorporate privacy reviews into routine operations (a simple trigger sketch follows below). Also, consider what steps may be necessary to ensure ongoing monitoring of partners for compliance.
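
As one illustration of such a trigger, the short TypeScript sketch below screens a new contract intake form for the common assessment triggers discussed in this post (data sales, targeted advertising, sensitive data processing, and profiling). The form fields and routing logic are simplified assumptions for illustration, not a statement of any statute’s actual thresholds.

// Hypothetical contract-intake screen: flags whether a proposed data flow
// triggers a data protection assessment or partner due diligence.
// The trigger list is a simplified illustration, not legal advice.
interface IntakeForm {
  sellsPersonalData: boolean;
  targetedAdvertising: boolean;
  processesSensitiveData: boolean;
  profilingWithLegalEffects: boolean;
}

function assessmentTriggers(form: IntakeForm): string[] {
  const triggers: string[] = [];
  if (form.sellsPersonalData) triggers.push("data sale");
  if (form.targetedAdvertising) triggers.push("targeted advertising");
  if (form.processesSensitiveData) triggers.push("sensitive data processing");
  if (form.profilingWithLegalEffects) triggers.push("high-risk profiling");
  return triggers; // a non-empty result routes the contract to privacy review
}

Routing any non-empty result to the privacy team turns the legal trigger analysis into a repeatable intake step rather than an ad hoc judgment call.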

5. FURTHER DISRUPTION

  • So far, the state privacy laws have followed a predictable framework based on the consumer rights, accountability and transparency mechanisms, contract requirements, and regulation of data sales and targeted advertising first featured in the CCPA and refined in the Colorado, Connecticut, and Virginia privacy law templates.
  • However, this framework that has persisted in 13 states is ripe for disruption. Already, states like Massachusetts, Maine, New Hampshire, and others have considered bills that take unique legislative approaches, some modeled on stalled federal legislation, that are drafted differently from existing state privacy laws.
  • Harmonizing compliance with the current slate of 13 state privacy laws is possible in large part because legislatures have passed compatible legislation. Incompatible legislation would be a game-changer that would increase the challenge of compliance in a fractured legislative landscape.
  • What to know: As we’ve seen in the last few years, the only constant in state privacy law is change. The ongoing challenge is building long-term privacy compliance infrastructure in the face of shifting legal requirements. We will continue to monitor and bring you the latest updates on new laws that take effect over the course of the coming year, to help you plan as much as possible for the road ahead.
Top Advertising Law Developments in 2023 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/top-advertising-law-developments-in-2023 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/top-advertising-law-developments-in-2023 Fri, 22 Dec 2023 12:30:00 -0500 If you follow our blog, you already know that there have been a number of significant developments in the world of advertising law over the past 12 months. In this post, we highlight ten of those developments and consider what they might mean for the future.

  • Regulators’ Favorite Shade – Dark Patterns: Following the FTC’s 2022 Dark Patterns Report and high-profile enforcement action against Epic Games, regulators including the FTC, CFPB, and state AGs continued to bring enforcement actions and provide guidance on perceived “dark patterns” – primarily related to automatic renewal and continuous service options, but also as to chatbots, disclosures, and marketing practices more broadly. In January, the CFPB released guidance focused on dark patterns in negative option marketing. In March, the NAD joined the discussion in a decision highlighting potential issues with Pier 1’s advertising of discounted pricing only available with a paid subscription and its use of a pre-checked box for enrollment in that same subscription. The FTC continued to lead the charge – with dark patterns allegations playing a key role in a number of enforcement actions, including against Publishers Clearing House, Amazon, and fintech provider Brigit.
  • Beyoncé and Taylor Swift Concerts Lead to War on Junk Fees: Okay, the war against junk fees may have predated the fees associated with the pop stars’ mega tours, but it continued in earnest throughout the year. As with dark patterns, the FTC, CFPB, and state AGs all took on junk fees at various times. Most notably, the FTC proposed a far-reaching rule that could fundamentally alter how prices and fees are disclosed by businesses across the country. The comment period for the proposed rule was recently extended until February 7, 2024. Not to be outdone, California passed new legislation banning hidden fees and the Massachusetts AG issued draft regulations that would prohibit hidden “junk fees,” enhance transparency in various transactions, and make it easy for consumers to cancel subscriptions.
  • Endorsement Guides: In June, the FTC released its long-awaited update to the Endorsement Guides. We noted that the Guides include some significant changes, including new examples of what constitutes an “endorsement,” details about what constitutes a “clear and conspicuous” disclosure, and an increased focus on consumer ratings and reviews. We also examined how the revisions could affect influencer campaigns. In November, we reported that the FTC had sent warning letters to two trade associations and 12 influencers over their posts, giving us a glimpse of enforcement to come. Meanwhile, NAD has also been active in this space and even referred a case to FTC for enforcement. Expect this to be a priority for both FTC and NAD in 2024.
  • Green Guides: The FTC’s Green Guides review progressed this year with an initial comment period closing in April, followed by an FTC workshop on “recyclable claims,” which we attended and highlighted here. Given the FTC’s history of hosting workshops on hot green topics, we expect to hear of more workshops in the new year. California has been active as well, with the governor signing a new law in October that aims to regulate carbon claims and make businesses more transparent about their carbon reduction efforts by requiring certain website disclosures (see our summary of the law here). The effective date is the first of the new year, but according to a recent letter from the bill’s sponsor, we expect that California will defer enforcement until January 1, 2025 to give companies time to comply (see here). With ESG efforts continuing to be front and center for most companies, consumers and regulators are holding companies accountable for those claims by questioning messaging about their efforts, aspirations for the future, and the basis for the claims (see, for example, here, here, and here).
  • Children’s Privacy: Congress, regulators, and advocates focused time and energy on children’s privacy issues in 2023. The House and Senate held hearings focused on children’s safety and privacy. Although the Senate Commerce Committee advanced the Kids Online Safety Act, it never received a floor vote; Senators Markey and Cassidy continued to advocate for approval of the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). The FTC reached settlements with companies about practices it alleged violated the Children’s Online Privacy Protection Act (COPPA) on the Xbox and Alexa platforms and with edtech provider Edmodo. In September, the FTC released a “Staff Perspective” on digital advertising to children, which included recommendations on how to protect kids from the harms of “stealth advertising.” Also in September, a federal court agreed with industry advocates that California’s Age-Appropriate Design Code Act, which imposes a variety of obligations on businesses that provide online services “likely to be accessed by children,” likely violated the First Amendment. California is appealing the decision, and regulators, including a number of Attorneys General and FTC Commissioner Alvaro Bedoya, have joined the state as amici. One of the most anticipated developments occurred with just 11 days left in the year, when the FTC proposed revisions to the COPPA Rule—more than four years after initiating its review process. Among other things, the proposed Rule would require new, additional consents for third-party disclosures and could affect operators’ approach to “internal operations.” Online services with children’s audiences have lots to consider in 2024 and beyond. Stay tuned for further updates.
  • State AG: State Attorneys General continued to make their presence felt in 2023. State AGs continued to go after companies for using fake reviews and false endorsements, enforced and proposed new price gouging rules, pursued telehealth companies for deceptive practices, supported the FTC’s Negative Option Rulemaking while bringing their own auto-renewal actions, continued to impose significant penalties against companies for data breaches, pursued companies for misleading consumer financial practices, and focused efforts on so-called “junk fees.” But two topics continue to be the highest priority of AGs – the impact of developments in AI (which we’ve written about here, here, and here – just to name a few) and protecting the most vulnerable consumers – especially our nation’s youth. The incoming president of the National Association of Attorneys General, Oregon Attorney General Ellen Rosenblum, has already made protecting youth, especially teens, this year’s presidential initiative. Look for AGs to continue this focus well into 2024.
  • Automatic Renewal: While auto-renewal service sign-up flows remain important, this year we saw cancellation processes become the hottest topic, as states enforce their specific requirements and the FTC draws attention to “click to cancel” through its proposed rule. But we shouldn’t forget all of the FTC’s other proposals under the negative option rule NPRM, including expanding the rule’s scope, requiring more specific disclosures, separate consent for the negative option feature, consent for save offers, and expanded notice requirements. Regardless of whether a federal rule formally comes into play in 2024, as referenced above, states are certainly on board with the FTC’s proposals, and they also resolved a multistate investigation this year requiring checkbox consent, online cancellation, and limits on save attempts. And don’t forget that Massachusetts is working on its own rulemaking involving online cancellation.
  • NAD: This year, NAD issued several decisions that caught our attention. For example, a decision in February narrows the scope of what claims may be considered puffery. NAD later elaborated on what it thinks advertisers must do in order to substantiate aspirational claims about future goals. NAD also issued a number of decisions involving endorsements – including employee endorsements and disclosure requirements – and even referred a case to the FTC for enforcement. In August, NAD held that emojis could convey claims, though NARB later disagreed with how NAD had applied that principle. As always, NAD plays a big role in the advertising law landscape, so companies will want to continue to watch what NAD does in 2024.
  • Same Product/Different Label Litigation: We chronicled a Connecticut district court’s denial of a motion to dismiss in a case in which the plaintiff alleged that Beiersdorf, maker of Coppertone sunscreens, engaged in false advertising by selling the same sunscreen formula in two different packages, one of which was labeled as “FACE” and sold in a smaller tube at twice the price of the regular Coppertone Sport Mineral sunscreen. That case is one to watch but it is not the only one of its kind. In fact, 2023 saw several similar cases involving allegedly the same formula marketed as different products with varying price points, such that the plaintiffs alleged that they were misled into purchasing the more expensive item because they believed it was uniquely suited to their needs when, in fact, it was the same as the lower-priced item. These cases involved a range of products, such as baby/adult lotions, infant/children’s acetaminophen, children’s/adult cold remedies, to name a few. So far, decisions are mixed, with some courts being more willing than others to find that the differing prices were justified. Marketers of food and personal care brands that merchandise the same formula in varying iterations will want to remain mindful of these cases as they update packaging and claims.

Keep following us in 2024, and we’ll keep you posted on how these trends develop. In the meantime, have a great holiday!

CPPA to Propose Changes to Privacy Policy Requirements https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/cppa-to-propose-changes-to-privacy-policy-requirements https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/cppa-to-propose-changes-to-privacy-policy-requirements Tue, 12 Dec 2023 10:00:00 -0500 While the California Privacy Protection Agency (CPPA) Board’s attention during its December 8 public meeting was mainly focused on preliminary draft regulations on automated decisionmaking technology (ADMT), risk assessments, and cybersecurity audits, the Board also decided to begin a formal process to revise its existing regulations.

The proposed changes emphasize the need to give consumers a “meaningful understanding” of personal information practices and the CPPA’s focus on providing information about data practices before consumers engage with a business. These changes are less far-reaching than the ADMT, risk assessment, and audit proposals, but they could affect how businesses make disclosures in their privacy policies and are likely to be finalized on a relatively short timeline.

Here are the key ways that privacy policy requirements would change under the CPPA’s proposal.

  • “Meaningful Understanding” of Sources and Third-Party Recipients of Personal Information

The draft revisions to sections 7011(e)(1)(B) and (E) would expressly include a requirement for privacy policies to give consumers a “meaningful understanding” of the sources from which a business collects personal information and the categories of third parties to which it sells or shares personal information. The phrase “meaningful understanding” is already in the current definitions of “categories of sources” and “categories of third parties” in section 7001. Its repetition in section 7011 could signal an expectation of increased specificity and clarity in how businesses describe the sources and recipients of the personal information they collect and sell.

  • Clarifying Disclosures to Service Providers and Contractors

Proposed revisions to section 7011(e)(1)(H) would require businesses to identify the categories of personal information that they disclosed to a service provider or contractor in the preceding 12 months, along with the business purpose for these disclosures. This change would remove an ambiguity in current section 7011(e)(1)(H), which also mentions disclosures to third parties for business purposes, a formulation that is arguably inconsistent with the definition of a third party. Companies that have interpreted subsection (H) differently may need to take another look at their privacy policies in light of this proposal.

  • Privacy Policy Links for Mobile Applications

Finally, the draft regulations propose to require mobile applications to include a link to their privacy policies within their settings menu. Under current section 7011(d), including a privacy policy link in an app’s settings menu is discretionary. This new requirement would be in addition to the current mandate to make the privacy policy available through the business’s homepage or app store download page.

What’s Next?

Once CPPA staff updates the draft revisions to reflect Board members’ input, the package of rule changes will be published for a 45-day public comment period. The CPPA did not indicate when the comment period will begin.

NAAG CP Fall 2023: Dark Practices? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/naag-cp-fall-2023-dark-practices https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/naag-cp-fall-2023-dark-practices Fri, 10 Nov 2023 11:00:00 -0500 On November 7, the National Association of Attorneys General (NAAG) 2023 Consumer Protection Fall Conference held its public day with a number of substantive and interesting discussions about the current state of consumer protection enforcement by the FTC and, of course, state AGs. We will be bringing you a series about the conference panels, each of which discusses a hot topic of enforcement for the coming year.

First, Attorneys General Kwame Raoul of Illinois and John Formella of New Hampshire kicked off the day with a discussion on “Dark Practices Impacting Consumer Privacy,” focusing on dark patterns with our very own Alysa Hutnik from Kelley Drye and Ben Wiseman, Associate Director of the Division of Privacy and Identity Protection with the FTC. We have previously covered dark patterns here and here and how they relate to typical UDAP claims; that sentiment was echoed throughout the morning’s panel.

AG Views

Attorney General Raoul started off self-deprecatingly, describing himself as an average consumer in regard to tech savviness, which shapes his perspective on convenience versus vulnerability when it comes to consumer privacy and dark patterns. Attorney General Formella mentioned the struggle of trying to avoid becoming a nanny state while still protecting consumers, and compared some of the practices on the internet to having someone in a department store physically follow a person around, watching everything they look at or nudging them to buy things. He posited that what we are talking about today on the internet is very similar but “in a more devious way.”

Defining Dark Patterns

Wiseman defined dark patterns for the audience and described the four types of dark patterns outlined in the FTC’s recent report, calling guarding against dark patterns the core of consumer protection and unfairness. Hutnik elaborated that the common theme in the FTC’s report is a set of familiar unfair and deceptive acts and practices concepts – such as failing to conspicuously disclose material information – but noted the struggle in identifying what is material to consumers and when there may be an adverse impact. Importantly, businesses should consider whether their practices would frustrate or surprise a consumer in a negative way.

Data Collection

General Formella moved the discussion along, asking what type of data companies are collecting and what they are doing with the data. Hutnik explained that companies want positive and continuing relationships with their customers/consumers and to provide them with what they are interested in. Businesses also need to account for the obligation to be clear about their data practices, and any data sharing needs to address privacy obligations. Wiseman agreed that there can be benefits to certain data collection practices, but noted that as consumers increasingly transfer their lives online, there is an increase in sensitive information and the monetization of data without much regulation. General Raoul asked about consumer attitudes on data privacy and notices, and Wiseman said several recent studies have shown consumers are not satisfied with the current notice and consent regime. However, Hutnik noted that companies are making significant changes in response to recent state comprehensive privacy laws, and consumers are showing loyalty to companies that are upfront and responsible with consumer data, with user-friendly privacy options available to the consumer. Hutnik also pointed out the significant investment some of these businesses have been making in data management infrastructure to be able to comply with these laws and deliver a better user experience for consumers.

As far as privacy goes, Wiseman said that the FTC is looking increasingly to unfairness to handle these issues. Hutnik responded that not every state has unfairness authority, but with comprehensive privacy laws cropping up in so many states, the challenge for businesses is looking ahead and building data strategies that account for the regulatory trend line. Attorney General Formella chimed in that New Hampshire is on the verge of passing its own comprehensive privacy law, with the legislature already committing resources to enforce it.

FTC Enforcement

General Formella asked about the FTC’s priorities regarding enforcement in the area. Wiseman responded that the dark patterns report shows a lot of examples. During the panel, he referenced Vonage, BetterHelp, GoodRx, and Epic Games as recent examples of FTC enforcement in the area of dark patterns. While most people recognize Epic Games as a COPPA case, Wiseman specifically pointed to the design choices regarding unauthorized charges of “V-Bucks,” where the company changed its “undo” button to be less prominent after testing revealed doing so would reduce consumer clicks. He also discussed the state AGs’ Google location tracking practices matter, which Wiseman said included dark patterns in hiding material information and inducing false beliefs regarding settings.

Takeaways

As Hutnik explained, businesses should keep the following in mind when it comes to dark patterns and privacy:

  • Tell consumers the truth and they tend to come back. It’s a long game.
  • Monitor consumer complaints for trends to determine whether you have an unintentional design issue that is leading to consumer frustration.
  • Consider whether you truly need sensitive information for your business, and in any event align any collection and use with consumer expectations.

And from Wiseman’s perspective, businesses should look to these resources to assist with compliance:

  • 2022 Dark Patterns Report, which includes an appendix of prior enforcement
  • Enforcement orders as well as complaints, because they give insight into how the FTC views practices and where they may become unlawful
  • Current rulemaking efforts related to dark patterns: Negative Option and Junk Fees

To sum it up, both Hutnik and Wiseman agree that businesses should compete based on privacy for the benefit of consumers.

Health Data Coding Error Costs Inmediata $1.4 Million with AGs https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/health-data-coding-error-costs-inmediata-1-4-million-with-ags https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/health-data-coding-error-costs-inmediata-1-4-million-with-ags Wed, 25 Oct 2023 09:00:00 -0400 We posted just last week about the Blackbaud multistate settlement, and as we have discussed, health privacy remains a hot topic and is already back in the news. On October 17th, 33 AGs led by Indiana, announced a multistate settlement in the form of a judgment with a Puerto Rico-based health care clearinghouse, Inmediata, for what the AGs alleged was a failure to appropriately safeguard data and a delayed and flawed notification to consumers of a coding issue. As a result, the states said protected health information (PHI) of approximately 1.5 million consumers was exposed to public online searches for almost three years. The AGs alleged, among other things, violations of the Health Insurance Portability and Accountability Act’s (HIPAA) Security Rule and its Breach Notification Rule.

Although the U.S. Department of Health and Human Services’ Office for Civil Rights is the most well-known enforcer of HIPAA compliance, state AGs have played a growing role in enforcing compliance with HIPAA’s Rules. In 2009, the Health Information Technology for Economic and Clinical Health (HITECH) Act authorized state AGs to bring civil actions on behalf of state residents impacted by violations of the HIPAA Privacy and Security Rules, as well as its Breach Notification Rule. The Connecticut AG was the first to exercise this enforcement right in 2010 against Health Net Inc. for a security breach involving private medical records and financial information. While much attention has been given to the passage of recent broad comprehensive state privacy laws and those specific to health, such as Washington’s My Health My Data Act and Connecticut’s recent amendments to its data privacy law adding provisions specifically related to health data, it is important to remember that states may also have specific laws that are similar to HIPAA but include more expansive definitions, such as the Texas Medical Records Privacy Act.

Here, the AGs alleged that Inmediata violated HIPAA’s obligations by failing to implement reasonable data security, including failing to conduct a secure code review at any point prior to the breach, and then failing to provide affected consumers with timely and complete information regarding the breach, as required by law. The settlement requires Inmediata to pay a $1.4 million fine, divided among the participating states, and to implement strong security practices going forward. This is just the most recent example of the increasing activity by state AGs utilizing their HIPAA enforcement authority. We will keep you apprised of any developments in this area as they unfold.

CFPB Previews Proposals that Could Fundamentally Shift Data Broker Business https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/cfpb-previews-proposals-that-could-fundamentally-shift-data-broker-business https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/cfpb-previews-proposals-that-could-fundamentally-shift-data-broker-business Mon, 25 Sep 2023 12:00:00 -0400 In connection with its convening of a panel of small businesses to provide input on potential regulatory actions, the CFPB released an outline of its proposals to use its rulemaking authority under the Fair Credit Reporting Act (FCRA) to cover data brokers and prohibit the use of medical debt collection data in making credit decisions. While the outline does not include any specific language, it evidences the Bureau’s desire to fundamentally alter the data broker business model by expanding the definition of “consumer reporting agency” (CRA) to cover more data brokers, and limit their ability to share consumer information without a permissible purpose. The CFPB also seeks to prevent CRAs from providing credit header data to third parties for purposes beyond the scope of the FCRA. In effect, the Bureau intends to significantly curtail the sale of certain personal data for marketing purposes.

This is just the latest development showing an increased, nationwide focus on the practices of data brokers, which we have detailed in this blog, and which recently led to a strict new data broker regulation in the state of California. Depending on how the CFPB’s proposals play out, they could transform how data brokers are regulated in this country.

Background on the FCRA

The FCRA covers “consumer reports” and imposes restrictions on CRAs that create and sell these reports, furnishers that provide data to CRAs, and users that consider consumer reports when making eligibility determinations about consumers. The famously circular statute (“famous” being an admittedly relative term when discussing a federal statute) defines a consumer report to be the communication of any information by a CRA bearing on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer’s eligibility for (A) credit or insurance to be used primarily for personal, family, or household purposes; (B) employment purposes; or (C) any other permissible purpose authorized under FCRA section 604.

Meanwhile, CRA is defined as any person that regularly engages in whole or in part in the practice of assembling or evaluating consumer credit information or other information on consumers for the purpose of furnishing consumer reports to third parties.

At the risk of oversimplifying things, in general, CRAs are those entities that assemble information about consumers for the purpose of providing reports to third parties for use in making determinations about consumers’ eligibility for credit, employment, or housing. Data brokers, on the other hand, are entities that collect information about consumers to be provided to third parties for non-FCRA purposes such as fraud prevention and marketing. Sometimes enforcers have alleged that companies purporting to be data brokers were, in fact, CRAs because they were selling consumers’ information to third parties such as background screeners and employers (see, e.g., the FTC’s cases against Spokeo and TruthFinder). More often, though, data brokers sell consumer information for marketing purposes without triggering the FCRA. There has been widespread concern that these data brokers can amass incredibly sensitive information about consumers without their knowledge, and that consumers have no control over how the data is shared.

The CFPB’s Proposal

The CFPB’s proposal would classify any report that includes data such as payment history, income, or criminal records as a consumer report. That would mean that any data broker selling this information would be a CRA and would only be able to share it for a permissible purpose – that is, for use in eligibility determinations. [The outline does not include a proposed definition of “sell” but, depending on how it is defined, the scope of the provision’s reach could be quite expansive.] So, for example, a data broker could no longer provide information about a consumer that includes her individual or household income (more on this later) to a retailer for marketing purposes. Data brokers could no longer sell criminal records to individuals who want to vet their dates.

The CFPB is also considering whether it should define “assembling and evaluating” to cover intermediaries or vendors that facilitate the transfer of consumer report information. Traditionally, companies that were mere conduits of information have not been considered to be assembling and evaluating information — and, hence, were not viewed as CRAs (see FTC’s 40 Years Report at 29). It is unclear if the Bureau intends to include “dumb pipes” in its definition of CRA, or just those vendors that clean or organize data before providing it to their clients.

While CRAs are prohibited from providing consumer reports without a permissible purpose, there has been a longstanding exception for the provision of credit header information. In particular, reports limited to identifying information such as name, address, previous address, SSN, and phone number have been considered exempt from the definition of consumer report if they do not bear on one of the seven factors and are not used to make an eligibility determination (see 40 Years Report at 21). Relying on this exemption, CRAs have provided credit header data to purchasers for use in marketing and fraud detection purposes. The CFPB’s proposal would consider credit header data to be a consumer report and would eliminate a CRA’s ability to provide this information for fraud prevention or marketing.

The outline also includes discussion of the following topics:

Target Marketing

The Bureau is considering clarifying that CRAs cannot use any consumer report information for targeted marketing. The CFPB is concerned that CRAs may be using consumer report data to help customers target marketing, in violation of the FCRA. Per the Bureau, these CRAs may incorrectly believe that this use of data is outside the scope of the FCRA if they do not furnish the information directly to clients, but rather provide the marketing to the consumers themselves.

Aggregated and Household Data

Significantly, the CFPB is also contemplating whether aggregated and household level data should be considered a consumer report. This would be a major change. A prohibition on the use of aggregated and household level data, such as the average income in a geographic area, for marketing purposes would reverberate across the marketplace.

Consumer Consent

Consumers can permit CRAs to share their consumer reports by providing written consent. The CFPB’s outline notes that it is considering placing limitations on how (and by whom) the consent may be collected, as well as on the scope of the consent, presumably to ensure that the consent is informed and meaningful. It is also mulling mechanisms through which consent may be revoked.

Legitimate Business Need

Another aspect of a potential proposal would be to limit the scope of the permissible purpose allowing a user to obtain a consumer report when it has a legitimate business need in connection with a business transaction initiated by the consumer. The CFPB may specify that this permissible business purpose must be for a personal, family, or household purpose. A legitimate business purpose related to account review would require that the consumer report be necessary to make a determination about a consumer’s continued eligibility for the account.

Data Security

Regulators have long made clear that they see the privacy provisions of the FCRA (limiting the use of consumer reports to certain permissible purposes) as requiring CRAs to take reasonable measures to protect those reports (see, e.g., the FTC’s case against SettlementOne and the statement of Commissioners Brill, Leibowitz, Rosch, and Ramirez). The Bureau’s outline notes that its proposal may address CRAs’ data security obligations under the FCRA. In addition, the CFPB is considering whether it should hold CRAs strictly liable for data breaches by considering the unauthorized release of any consumer report to be a violation of Section 604, which prohibits furnishing a consumer report to anyone without a permissible purpose.

Disputes

Under the FCRA, consumers have the ability to dispute inaccurate information contained in their consumer reports with the CRA or directly with the furnisher of the information. Some private litigation has focused on whether CRAs and furnishers have a duty to investigate so-called legal disputes. The CFPB’s proposal would make clear that the FCRA requires investigation of both legal and factual disputes. Simply put, a legal dispute is a dispute that hinges on an interpretation of a law. The Bureau’s outline uses the example of a state foreclosure law. If a consumer disputes the accuracy of a report that lists him as having mortgage debt, the CFPB would require that the CRA investigate whether the state’s anti-deficiency statute required the debt to be extinguished.

In addition, the CFPB says it wants to tackle what it considers to be systemic issues that affect the completeness and accuracy of consumer reports – for example, outdated software or deficiencies in a furnisher’s policies and procedures to assure data accuracy. The outline notes that the CFPB is thinking about ways that CRAs and furnishers could be notified of potential systemic issues, which they would have to investigate and, if necessary, address. Among the CFPB’s proposals for consideration is requiring a mechanism through which consumers could report suspected systemic issues. It is also considering whether consumers should be notified of any systemic issues that affected their reports, even if the issue was identified in response to a complaint from another consumer. This could potentially result in consumers receiving notices about issues at CRAs that may have affected their reports, but did not have a negative impact on them because the inaccurate reports were not shared.

Medical Debt Collection Information

Finally, the CFPB is considering revising Reg V which, among other things, covers medical debt collection information. The potential revisions would prohibit creditors from using this information to make credit eligibility determinations, and prohibit CRAs from including this information on consumer reports for credit eligibility. Medical debt collection information has long been a source of concern for consumers, legislators, and regulators, since it can prevent consumers from obtaining credit following a medical emergency or be used to coerce consumers into paying spurious or false unpaid medical bills. In addition, the CFPB believes there is compelling evidence that this information does not have predictive value for credit decisions. The Big 3 CRAs ceased reporting paid medical collection debt, medical collection debt under $500, and any medical collection debt that is less than one year past due. The Bureau’s proposal would further limit the ability of medical debt collection tradelines to affect a consumer’s ability to obtain credit.

Next Steps

The CFPB is accepting comments on this outline until October 30, 2023, and is especially interested in feedback from small businesses that would be affected by the rule. Once the Bureau completes this process, which is required under the Small Business Regulatory Enforcement Fairness Act of 1996 (SBREFA), it can issue a more formal rulemaking proposal, which will be put out for public comment. It seems unlikely that any proposal would be announced before 2024. However, the CFPB is clear that it envisions a sea change in the scope of the FCRA, and businesses should be ready to provide input and comment.

Practical Privacy: Lessons from the Front Lines https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/practical-privacy-lessons-from-the-front-lines https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/practical-privacy-lessons-from-the-front-lines Thu, 31 Aug 2023 00:00:00 -0400 With the continuing onslaught of state privacy laws, it’s easy to become overwhelmed by the number of new legal obligations while also trying to stay focused on identifying and mitigating the most pressing legal and business risks. Over the past couple of months, we’ve had the opportunity to meet with privacy professionals to hear about their top challenges and offer some practical perspectives of our own.

Three of the topics that stood out during these discussions – there were several others – were: understanding and managing data protection impact assessments (DPIAs), assessing sensitive personal information (SPI) risks, and implementing data deletion obligations. This post shares some of the tips that emerged from these sessions.

Data Protection Impact Assessments

Four states require DPIAs today for certain processing activities, and laws that go into effect in five additional states beginning in 2024 will require them, too. Across most of these states, the activities that trigger the need to conduct a DPIA include targeted advertising, data sales, and sensitive data processing. Beyond these clearly defined starting points, however, practical challenges abound. What form should a DPIA take? Who should be responsible for drafting the assessment? What are the best practices for keeping DPIAs up to date?

One way to look at these questions is that DPIAs tell the story about how a company uses personal data. Regulators will be one audience for these stories. Some states’ laws allow the attorneys general to request production of DPIAs from organizations. California law requires businesses to submit their DPIAs to the CPPA on a “regular basis” (with details now set forth in draft regulations). Regulators will expect to see (in the words of the Colorado Privacy Act’s implementing regulations) a “genuine, thoughtful analysis” of the benefits, potential harms, and mitigations in a company’s data practices.

At the same time, although some state privacy laws provide protection for attorney-client privilege and confidentiality, we expect DPIAs to generate investigations and to have their privilege and confidentiality protections challenged. Carefully planning the diligence and drafting stages of a DPIA – and taking care to maintain safeguards for communications that involve legal advice – is critical to ensuring that DPIAs are accurate and comprehensive while minimizing additional risk to the company.

Finding internal champions and identifying key stakeholders are also critical steps. DPIAs take time away from IT, engineering, business, legal, compliance, and privacy teams who have day jobs. In most cases, their contributions are essential to assemble an accurate picture of the activity that’s at the center of a given DPIA.

The message that spurs these teams to participate meaningfully in the DPIA process will vary. In some cases, buy-in might arise from a shared understanding that the company needs to align on whether its current practices are sufficient to protect against known risks. In other instances, a clear message of support from the top of the organization might be necessary.

In short, there isn’t a single format or process that will work for everyone. However, recognizing that the stakes involved in DPIAs are significant and planning accordingly are first steps toward identifying which processing activities to tackle first and how to go about it.

Sensitive Personal Information

In addition to triggering a DPIA obligation, SPI processing under state laws and emerging enforcement precedent may require opt-in consent. Identifying SPI collection and use is therefore a growing priority for many privacy professionals.

But the expansive definition of SPI under state privacy laws is only part of the equation. Sector-specific laws, such as the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule and the Illinois Biometric Information Privacy Act, expand the range of sensitive data that receives heightened protections. These laws have become an increasing focus for regulators and the plaintiffs’ bar at the same time that data from these sectors is becoming more valuable for new services and, in some instances, advertising.

Where SPI is used in marketing and advertising, companies face compliance challenges and potential exposure to private suits. Using alternatives to SPI can mitigate these risks. For example, in lieu of SPI, some companies are exploring the use of aggregated demographic data to power insights or target advertising based on non-sensitive purchasing behavior.

Practical approach to SPI

  • Cataloging data starts with thoughtful DPIAs and a robust understanding of the business use cases involving SPI.
  • Consent is the baseline expectation for SPI processing. Consider building a consent management infrastructure that accounts for both direct collection and sourced (or inferred) data; a minimal sketch of one such record structure follows this list.
  • Explore emerging alternatives to SPI and implement mitigation measures in the meantime.
  • Think for today and for tomorrow. Short-term and long-term plans are crucial for developing a comprehensive and durable risk-management strategy. Set a cadence to revisit the plans.
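To make the consent-infrastructure point concrete, here is a minimal sketch of what a purpose-specific consent record might look like in code. Everything in it (the class names, the SPI categories, and the may_process check) is an illustrative assumption, not a requirement drawn from any statute, regulation, or vendor API.

```python
# Illustrative sketch only: field names, categories, and logic are
# hypothetical assumptions, not requirements from any statute or vendor API.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum


class Source(Enum):
    DIRECT = "collected directly from the consumer"
    SOURCED = "acquired from a third party"
    INFERRED = "inferred from other data"


@dataclass
class ConsentRecord:
    consumer_id: str
    spi_category: str   # e.g., "health", "biometric", "precise_geolocation"
    source: Source      # direct, sourced, or inferred
    purpose: str        # the specific processing purpose disclosed to the consumer
    opted_in: bool
    timestamp: datetime


def may_process(records: list[ConsentRecord], consumer_id: str,
                spi_category: str, purpose: str) -> bool:
    """Treat opt-in consent as the baseline: allow SPI processing only if an
    affirmative, purpose-specific consent record exists for this consumer."""
    return any(
        r.consumer_id == consumer_id
        and r.spi_category == spi_category
        and r.purpose == purpose
        and r.opted_in
        for r in records
    )


records = [ConsentRecord("c-123", "health", Source.SOURCED,
                         "personalized recommendations", True,
                         datetime.now(timezone.utc))]
print(may_process(records, "c-123", "health", "personalized recommendations"))  # True
print(may_process(records, "c-123", "health", "ad targeting"))                  # False
```

The design choice worth noting is that consent is recorded per purpose and per source, so data that was purchased or inferred is never silently treated as though the consumer consented to every possible use of it.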

Data Deletion Obligations

All comprehensive state privacy laws that have been enacted so far give consumers the right to request deletion. State laws vary in the level of detail they provide about deletion – regulations in California and Colorado are quite specific in their procedures – but all provide significant leeway to retain data for internal purposes that are reasonably aligned with consumers’ expectations. In practice, it is not uncommon to preserve some data to meet operational needs or comply with legal obligations.

This leads to two challenges. First, companies need to communicate clearly with consumers, service providers, and third parties about how they’re fulfilling deletion requests. Second, companies need ways to ensure that data they keep under an exemption is not used for other purposes.

A few practical steps can help:

  • Prior to developing a process for responding to deletion requests, map out your data to understand what personal information you have, where it is located, and with whom you share it. Identify any legal obligations surrounding how long you must keep it, including any minimum retention periods.
  • Develop and maintain systems to notify service providers and third parties about data deletion requests. Methods for sending deletion requests to partners vary widely, from self-serve, automated interfaces to ad hoc requests handled case-by-case, so be prepared to work with a wide range of processes. (A minimal sketch of this workflow follows this list.)
  • Communicate clearly with consumers about the outcome of their deletion requests: whether the request was honored in whole, in part, or not at all.
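As a companion to these steps, here is a minimal sketch of how a deletion request might be walked across a data map, recording any retention exemptions so the consumer response can accurately state whether the request was honored in whole or in part. The system names, exemption labels, and interfaces are hypothetical assumptions; real deletion programs span everything from automated partner APIs to ad hoc email requests.

```python
# Illustrative sketch only: system names and exemption labels are
# hypothetical; real partner interfaces vary from APIs to manual tickets.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DeletionOutcome:
    system: str
    deleted: bool
    exemption: Optional[str]  # e.g., "legal_hold", "fraud_prevention", or None


def process_deletion_request(consumer_id: str,
                             data_map: dict[str, Optional[str]]) -> list[DeletionOutcome]:
    """Walk the data map (system -> retention exemption, if any), delete where
    no exemption applies, and record what was retained and why."""
    outcomes = []
    for system, exemption in data_map.items():
        if exemption is None:
            # In practice this would call the system's or partner's deletion
            # interface, which may be automated or handled case-by-case.
            outcomes.append(DeletionOutcome(system, deleted=True, exemption=None))
        else:
            # Data retained under an exemption must be fenced off from other uses.
            outcomes.append(DeletionOutcome(system, deleted=False, exemption=exemption))
    return outcomes


# Hypothetical data map built during the data-mapping step above.
data_map = {"crm": None, "ad_platform": None, "billing": "legal_hold"}
for outcome in process_deletion_request("c-123", data_map):
    print(outcome)
```

Whatever the mechanism, the second challenge noted above still applies: data retained under an exemption needs to be segregated so it is not quietly reused for other purposes.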

If Kelley Drye can help your organization develop a practical approach to building and maintaining a robust privacy program, please contact any member of our Privacy Group.

Mounting Focus on Data Brokers: Is More Regulation Coming?
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/mounting-focus-on-data-brokers-is-more-regulation-coming
Thu, 24 Aug 2023 00:00:00 -0400

During the past year, there’s been a flurry of regulatory activity related to data brokers. Whether in Congress or state legislatures, at federal agencies or the White House, many policymakers are pushing in the direction of increased regulation. For those not following this issue closely, here’s a snapshot of some key developments, starting with some history:

Background on Data Broker Regulation

The debate surrounding data brokers and regulation isn’t new. For decades, policymakers and enforcers have raised concerns about the collection and sale of consumer data by these entities, citing the sensitive nature of the information and profiles that they sell, the use of this data in making consequential decisions about consumers, and the invisibility of most data brokers to the public. (See, e.g., here, here, and here.)

In the 1970s, Congress passed the Fair Credit Reporting Act (the nation’s first commercial privacy law) to regulate consumer reporting agencies (CRAs), an important subset of these entities. The FCRA sets forth data privacy and accuracy requirements when CRAs sell (and companies furnish and use) consumer data for decisions affecting people’s eligibility for credit, jobs, and insurance. The FCRA didn’t end the debate, however. Since then, some policymakers have pressed for broader regulation of data brokers, especially as mobile devices and other technological advances have enabled data brokers to collect more detailed data about consumers, draw more granular inferences and predictions, and sell this information to the public. In response, data brokers have pointed to the beneficial services they provide and have argued that existing laws (including the FCRA, the Gramm-Leach-Bliley Act, the FTC Act, and now numerous state privacy laws) are adequate to address any harms that occur.

Recently, this debate has accelerated, as shown by the increased regulatory activity we are seeing today. For some policymakers, the overturning of Roe v. Wade and its implications for reproductive privacy have added an important new dimension to the debate. On August 15, the White House convened a roundtable of government officials, academics, advocates, and other experts to discuss “harmful data broker practices” and provide further impetus for regulation.

Congress

So, what specific proposals are we seeing? Not surprisingly, some of them are coming from Congress. In July, we blogged about two bipartisan efforts to stop the government from purchasing consumers’ location, web browsing, and search history data from data brokers, absent a warrant or other due process measures. One of these proposals (an amendment to the House National Defense Authorization Act bill) would restrict such purchases by DOD. Another (the Fourth Amendment is Not for Sale Act, now introduced in both the House and the Senate) would restrict such purchases more broadly across the federal government. All of these bills are pending, with Congress now in recess.

Readers also may recall that the leading federal privacy bill (the bipartisan American Data Privacy and Protection Act) contains strict data broker provisions requiring online registration and a one-stop mechanism allowing consumers to delete data held by data brokers and prevent further collection by these entities. Other recent federal bills (e.g., the bipartisan DELETE Act) contain even stricter data broker requirements.

Federal Trade Commission

The FTC is also very active in this area. In a 2022 blogpost, an FTC official warned that the FTC will use the “full scope of its authorities” to stop the “illegal use and sharing” of consumers’ location, health, and other sensitive data. Soon after, the FTC filed a lawsuit against data broker Kochava, alleging that its sale of location data obtained from mobile devices harms consumers and is legally “unfair” because the data can reveal sensitive locations that consumers visit, such as reproductive health clinics, places of worship, homeless and domestic violence shelters, and addiction recovery facilities. In addition, the ANPR in the FTC’s Commercial Surveillance and Data Security Rulemaking is replete with references to data brokers and data sales, suggesting that this could be a focus of any rule it proposes.

Like Congressional efforts, the FTC’s actions here are pending. In Kochava, the court dismissed the FTC’s initial complaint due to what it viewed as the hypothetical nature of the FTC’s injury allegations, but the FTC has filed a new complaint (under seal). In the FTC’s rulemaking, the comment period for the ANPR closed last November, so the FTC could release a proposed rule any day now. We await news on both fronts.

California – SB 362

No privacy discussion would be complete without California. And sure enough, the California legislature is currently considering new data broker legislation. In brief, SB 362 would amend the state’s existing data broker law by establishing an “accessible deletion mechanism” through which consumers can direct data brokers to delete their information. A deletion request would in turn trigger a ban on further data collection by these entities, unless the consumer opts back in. The bill also would allow an “authorized agent” to request deletion on the consumer’s behalf, require independent compliance audits every three years, and mandate regular reports to the public and to the California Privacy Protection Agency. Due to the broad definition of “data broker,” the bill would cover a wide array of entities, including members of the advertising industry.

If passed, this law would substantially up the ante for data brokers operating in California, and the approach could spread to other states. Currently, eleven states have enacted comprehensive baseline privacy laws, but only a few have data broker laws, with mostly modest requirements. Not surprisingly, opposition to the bill is strong among the data broker and ad industries, which (according to news reports) argue that it will hurt anti-fraud efforts and the economy and have launched an effort to defeat the bill. Because California’s legislature adjourns September 14, the window for action is closing soon.

Consumer Financial Protection Bureau

Finally, in what could be the most consequential data broker regulation of all, CFPB Director Rohit Chopra just announced (on the day of the White House roundtable) that the CFPB will soon launch a rulemaking to “modernize” the FCRA so that it reflects how today’s data brokers “build even more complex profiles about our searches, our clicks, our payments, and our locations” and “impermissibly disclose sensitive contact information” of people who don’t want to be contacted, such as domestic violence survivors.

Among other things, per Director Chopra, the CFPB is considering proposals to bring within the FCRA (1) a data broker’s sale of certain types of data (e.g., payment history, income, criminal records) because the data is “typically” used to make credit, employment, or certain other eligibility determinations and (2) credit header information, a major source of information for data brokers that has long been considered to fall outside the FCRA. Such proposals would dramatically extend the FCRA’s reach to a broader class of data brokers than are currently covered. According to Director Chopra, the CFPB will publish an outline of proposals and alternatives next month.

* * *

All of the above proposals are now pending, so it’s not clear whether they will come to fruition or what shape they will ultimately take. However, the sheer volume of activity shows that data brokers are in the spotlight and are likely to remain there for a while.

Kids’ Privacy and Safety Redux: Amended KOSA and COPPA 2.0 Advance By Voice Vote
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/kids-privacy-and-safety-redux-amended-kosa-and-coppa-2-0-advance-by-voice-vote
Tue, 01 Aug 2023 00:00:00 -0400

Last year, the Senate Commerce Committee marked up two bipartisan bills to protect kids’ privacy and safety – the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) – amidst high hopes that the bills would get a vote on the Senate floor. With comprehensive privacy legislation still tripped up over preemption and private rights of action, policymakers thought that legislation to protect kids would have the best chance of passage. The bills never made it to the floor, however, and they died in the 117th Congress.

This year, the bills’ sponsors are trying again and, on July 27, the Committee marked up amended versions of both bills. (The markups came on the heels of President Biden’s once again urging passage of these bills in public remarks.) The amendments to the bills address policy concerns that various groups have continued to raise since the bills were introduced last year. We watched the July 27 hearing to see what we might learn about the prospects for the bills’ passage in 2023.

Brief Background on the Bills, as Amended

For those in need of a reminder, KOSA (sponsored by Senators Blumenthal (D-CT) and Blackburn (R-TN)) is a kids’ and teens’ safety bill, designed to reduce harmful content on social media and to give minors and parents more tools and controls to block or filter such content. The bill has strong support from members of the child safety, medical, and consumer advocacy communities. At the same time, however, other consumer advocates, as well as the tech community, have criticized features of the bill that they believe would block minors’ access to content (including about LGBTQ+ and abortion issues) and/or potentially require the collection of more data from or about minors to determine their age. To address these concerns, the new version of the bill amends various definitions, as well as the standard governing when companies are expected to know that a user is a minor, among other changes.

COPPA 2.0 (sponsored by Senators Markey (D-MA) and Cassidy (R-LA)), by contrast, is a privacy bill, the primary purpose of which is to extend privacy protections to teens 13 through 16 and to change COPPA’s “actual knowledge” standard so that websites and apps have greater obligations to know when they are dealing with minors. As with KOSA, various groups have expressed concern about whether the proposal would lead to restrictions on content available to minors. And as with KOSA, the new version of the bill includes various changes, including revisions to the knowledge standard proposed in last year’s version.

The Markup

The markup this year was part of a full Committee Executive Session considering multiple bills on a range of topics. At the session, the Committee approved both of the amended kids’ bills, as well as several additional amendments to each of them (most of which were relatively minor). While there wasn’t extensive discussion surrounding these bills, Committee Members took the opportunity to highlight the importance of kids’ privacy and safety, as well as future actions that they’re contemplating in this area. Here’s our rundown of notable moments from the hearing:

Chair Cantwell (D-WA) led the session, addressing the multiple bills being considered (including, e.g., legislation on the topics of satellite waste in space and AM radio capabilities in cars). With regard to children’s privacy, she described COPPA 2.0 as a “vital upgrade” to protect minors, closing loopholes and protecting teens 13 through 16. KOSA, she explained, is also long overdue, as there is an ongoing mental health crisis related to social media’s impact on children. Cantwell acknowledged, however, that there are still outstanding concerns among groups who would be affected by the legislation (including members of the LGBTQ+ community), requiring additional work before the bills reach the floor. She added that these bills are “not the last” of the privacy issues that the Committee will consider and that she hopes, when the Committee returns in September, that it will consider other privacy issues as well.

Ranking Member Cruz (R-TX) offered support for both bills advancing out of the Committee. The Internet hasn’t come without cost, he said, especially for children. He further chided Big Tech companies for failing to protect children or give parents the safeguards and controls they need to protect their kids. Finally, he suggested adding a potential preemption clause to KOSA, as multiple states have passed laws that may be inconsistent with parts of the bill.

Senator Schatz (D-HI) gave an impassioned speech about protecting children, stating that we are in a crisis – an epidemic of teen mental illness due to the algorithmic boosting of harmful content online. He cited a study from the CDC showing that two thirds of high-school girls feel persistently sad or hopeless, and that 22% of all high-school students have seriously considered suicide. Due to his concerns about these issues, Schatz initially offered an amendment to KOSA – his Protecting Kids on Social Media Act – which would prohibit users under the age of 13 from accessing social media platforms, require parental consent for children 13 through 17, and ban the recommendation of content using algorithms to all minors under 18. However, he later withdrew his amendment, noting “productive conversations” he’d had with Cantwell and Cruz. For their part, Cantwell and Cruz expressed eagerness to work with Schatz on these issues in the fall.

Senator Thune (R-SD) offered an amendment to KOSA that would require platforms to notify users if they are using an algorithm, which passed by voice vote. Senators Blackburn (R-TN) and Klobuchar (D-MN) both expressed frustration about how long it has taken to address kids’ privacy and safety issues and said now is the time for action. Other Members, such as Senators Sullivan (R-AK) and Welch (D-VT), expressed support for the bills and their commitment to protecting children’s safety. Finally, Senator Markey (D-MA) confirmed his commitment to address concerns raised by the LGBTQ+ community and others, but expressed confidence that he would be able to resolve them.

* * *

Bottom line: While the Committee approved both bills, there will likely be more changes before either bill reaches the Senate floor. Further, while President Biden and Senate Majority Leader Schumer (D-NY) have both stated (at times) that these bills are a priority, the clock in the 118th Congress is ticking.

Join Kelley Drye for a Webinar with the Colorado Attorney General’s Office: Colorado Privacy Act, AI, and Teen Mental Health
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/join-kelley-drye-for-a-webinar-with-the-colorado-attorney-generals-office-colorado-privacy-act-ai-and-teen-mental-health
Sun, 23 Jul 2023 00:00:00 -0400

Join Kelley Drye’s State Attorneys General Ad Law Team on Monday, July 24, 2023 at 3 pm ET for the next program in the 2023 State Attorneys General Webinar Series: Colorado Attorney General’s Office – Colorado Privacy Act, AI, and Teen Mental Health.

Special guest speakers Colorado Attorney General Phil Weiser and Nathan Blake, Deputy Attorney General for Consumer Protection, will join Kelley Drye State Attorneys General practice Co-Chair Paul Singer, Special Counsel Abby Stempson, and Senior Associate Beth Chun for a discussion on a variety of important consumer protection topics, including teen mental health and the increased use of artificial intelligence. AG Weiser has been a national leader on these important topics and will discuss how AGs are using their existing and new consumer protection authority to examine emerging issues.

In addition, the Colorado Privacy Act went into effect on July 1. AG Weiser marked the occasion with a series of letters sent to businesses announcing the office’s intent to enforce the new law. Join us to learn more about the Act, and the office’s enforcement plans.

REGISTER HERE

Spotlight on Data Sales and the Fourth Amendment: Two Bipartisan Bills in the House
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/spotlight-on-data-sales-and-the-fourth-amendment-two-bipartisan-bills-in-the-house
Fri, 21 Jul 2023 00:00:00 -0400

With so much going on in the privacy space, it can be hard to keep track of everything. For example, while you were struggling to keep pace with rapidly advancing state privacy laws, FTC and EU privacy developments, market and technological changes, and various proposals to protect children’s privacy, you might have missed some eye-opening developments regarding the government’s purchase of consumer data from data brokers and other third-party data sellers.

Specifically, the House has advanced two bipartisan proposals to limit such purchases. If even one of these proposals were enacted, it would represent one of the more significant privacy actions taken by Congress in a long time. Here are the details:

Privacy Amendment in the House NDAA Bill

Some of the last-minute amendments to the National Defense Authorization Act (NDAA) bill passed by the House have generated a lot of controversy and concern. But a privacy amendment offered by Rep. Davidson (R-OH) slipped through largely unnoticed.

The amendment would prohibit the Department of Defense (DOD) from “acquir[ing] location information, web browsing history, Internet search history, and Fourth Amendment protected information” of US persons inside the US for (1) foreign intelligence purposes, except as permitted under the Foreign Intelligence Surveillance Act (FISA), or (2) law enforcement purposes, except with a warrant demonstrating probable cause.

The amendment would further require that, if the interception, compelled production, or physical search and seizure of information would require a warrant, court order, or subpoena under law, DOD may not obtain that information from a third party without obtaining the warrant, court order, or subpoena. There is an exception if the information being sought is aggregated or anonymized so that it “cannot reasonably be de-anonymized or otherwise linked to any individual or groups of individuals” and DOD does not disclose the information to any law enforcement agency or to the intelligence community.

The amendment’s seven other sponsors included Rep. Jacobs (D-CA). As readers may know, Jacobs is also the lead sponsor of the My Body, My Data Act, a bill that would limit how women’s reproductive and sexual health data may be collected, used, and shared with third parties.

The House’s “The Fourth Amendment is Not for Sale” Act

Just as the NDAA was passing the House, Rep. Davidson also introduced a bill (H.R. 4639) to place similar limits on how the government (not just DOD) acquires consumer data from electronic communications providers (e.g., providers of phone and email service). The bill is co-sponsored by House Judiciary Ranking Member Nadler (D-NY), as well as Rep. Jacobs and other supporters of the NDAA amendment. Like the NDAA amendment, the bill provides protections for consumers’ location information, web browsing history, Internet search history, and Fourth Amendment protected information.

In brief, and as summarized here, the bill amends portions of the Electronic Communications Privacy Act of 1986 (ECPA), which restricts how the government can obtain people’s communications from electronic service providers. Among other things, the House bill requires the government to obtain a court order to obtain consumer data from data brokers. It also “closes the loophole” that has allowed the government to purchase data from data brokers and other sellers without obtaining a warrant, court order, or subpoena as required under ECPA and FISA. On July 19, the House Judiciary Committee marked up the bill, with broad support from both sides of the aisle. The next step will be a vote on the House floor.

Rep. Davidson’s bill is similar to a proposal (also called the Fourth Amendment is Not for Sale Act) that Sen. Wyden (D-OR) introduced last year in the Senate, with support from a number of bipartisan co-sponsors. Wyden has told the press (in an article behind a paywall) that he doesn’t intend to reintroduce that bill this year, but that he plans to incorporate similar protections into a comprehensive surveillance reform bill that he is developing.

Why is this Significant?

The data privacy limits proposed here reflect simmering concerns about the ease with which the government can purchase consumers’ sensitive data from data brokers and other sellers to get around the privacy restrictions imposed under the Constitution and US laws. The Dobbs decision overturning Roe v. Wade has added to these concerns, raising fears that government entities in anti-abortion states will purchase information revealing details about women’s health and location and use it for law enforcement purposes.

As noted above, federal laws (including FISA and ECPA) impose limits on the ability of the government to obtain consumer information from certain entities and/or for certain activities without a warrant, court order, or subpoena. Further, in Carpenter v. United States, the Supreme Court held that the government’s acquisition of a person’s cell phone records from a wireless carrier (which can reveal a person’s precise location over time) was a Fourth Amendment-protected search, requiring a warrant supported by probable cause.

However, several years ago, the Wall Street Journal and other news outlets began reporting that the government was getting around these restrictions by purchasing consumer data from data brokers and other sellers, rather than obtaining it directly pursuant to federal legal process. This prompted efforts in Congress to close this “loophole” through legislation such as Sen. Wyden’s bill. Now, with these two House proposals, these efforts appear to be gaining bipartisan steam.

It’s far from certain that these proposals will end up becoming law. The House NDAA must now be reconciled with the version passed in the Senate, which doesn’t include Davidson’s amendment language. Further, if Wyden doesn’t re-introduce his bill in the Senate, but instead incorporates its protections in broader and potentially more controversial legislation, it could get tied up and fail to advance. Nevertheless, the bipartisan support that these proposals have received in the House could be a sign of more to come.
