Ad Law Access
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access
Updates on advertising law and privacy law trends, issues, and developments

New Bipartisan Federal Privacy Bill – Breakthrough, Too Late, or Both?
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-bipartisan-federal-privacy-bill-breakthrough-too-late-or-both
Tue, 07 Jun 2022 16:53:18 -0400

On Friday, June 3, a bipartisan group of leaders from key House and Senate committees released a new “discussion draft” bill to establish nationwide standards for consumer privacy. The proposal (the American Data Privacy and Protection Act) builds on prior bills put forth by both Democrats and Republicans, as well as principles and provisions contained in the GDPR and State privacy laws. Of significance, the bill reflects bipartisan compromise on two thorny issues that have divided the parties for years – whether to preempt state privacy laws and/or include a private right of action. While the bill has been hailed as a “breakthrough,” the prospects for passage are uncertain, particularly in this busy election year.

Why is this bill significant?

As most of our readers know, the US has no overarching federal privacy law – only sector-specific laws such as GLBA and COPPA. This patchy, confusing scheme has become even more complex with passage of the GDPR (which applies to US multinational companies) and five comprehensive State laws. While many federal bills have come and gone over the years, none reflects the high-level bipartisan compromise evident here – both on longstanding privacy concepts (notice, choice, access, security) and on more specific concerns about discrimination, algorithms, platforms, data brokers, targeted ads, and corporate accountability. If passed, the bill would apply to virtually all companies doing business in the US.

Why is this happening now?

While many observers wish a bipartisan bill had been proposed earlier, the forces driving this bill forward have never been stronger. Passage of State laws is accelerating, the EU is exerting greater influence over privacy worldwide, and the FTC is planning to launch wide-ranging privacy rulemakings. In addition, Senator Wicker, one of the bill’s authors and a longtime leader on privacy, may soon vacate his slot as Commerce’s top Republican, motivating him to cement his legacy now. To cap it all off, while an election year is indeed a difficult time to pass a bill like this, it also creates pressure to make one last effort on privacy.

Key elements of the bill

The bill is extremely comprehensive and ambitious but, as expected, reflects compromise on certain issues. While we can’t possibly summarize everything in a blogpost, here are some of the highlights:

  • Scope: The bill covers entities subject to the FTC Act, as well as common carriers and non-profits. It applies to data that is linked or linkable to an individual, or to a device that is linkable to one or more individuals, including derived data and unique identifiers. There are exclusions for de-identified data, employee data, and publicly available data, but not for small businesses (though they’re excluded from certain provisions). The net effect is that the bill covers virtually every company in the US and a good portion of their data.
  • “Standard” Provisions: The ADPPA contains many elements that are now fairly standard in privacy laws and bills – privacy notices; the right to access, correct, and delete data, and to request it in a portable format; data minimization; privacy by design; data security; and corporate accountability. While these requirements differ in various ways from those in the State laws, the most notable departure is the strictness of the data minimization requirement (limiting the collection, processing, and transfer of data to what’s necessary to provide a specific product or service requested by an individual, or to communicate in the context of the relationship). “Large data holders” (platforms and other large companies) must comply with enhanced notice and accountability requirements.
  • Duty of Loyalty: The ADPPA includes several requirements in a section called “Duty of Loyalty” (a section that borrows its title, but not its contents, from a bill introduced by Senator Schatz). The notable requirements in this section include:
    • With certain exceptions, companies can’t collect, process, or transfer SSNs or nonconsensual intimate images. They also can’t transfer passwords.
    • With certain exceptions, companies can’t collect, process, or transfer biometric or genetic data without affirmative express consent. (Unlike State privacy laws, these protections apply even when this data doesn’t identify, or can’t reasonably identify, an individual.)
    • With certain exceptions, companies can’t transfer a person’s precise geolocation, search or browsing history, or physical activity from their device without affirmative express consent.

Note that some of these provisions appear to overlap and/or conflict with other provisions of the bill. In particular, because biometric and genetic information, precise geolocation, online activities, and log-in credentials are defined as “sensitive covered data,” they’re also subject to the opt-in requirements discussed below. The restrictions on search and browsing data may also conflict with the law’s purported opt-out regime for targeted advertising.

  • Rights to Consent & Object: Like many privacy laws, the ADPPA requires opt in for certain practices and opt out for others.
    • Opt in is required before a company can process, collect, or transfer sensitive data. The bill defines sensitive data broadly and gives the FTC rulemaking authority to add new categories. Of note, the definition includes health, financial, biometric, genetic, and precise geolocation data; a person’s private communications, media viewing history, and online activities; data revealing race, religion, or union membership (if such data isn’t public); and the data of individuals under 17 (although the age is in brackets, indicating that it is still under discussion). As noted above, including “online activities” here may conflict with the opt out for targeted advertising.
    • Opt out is required for data transfers to third parties (called “sales” in State laws) and targeted advertising (defined to exclude contextual advertising, ad reporting and measurement, and certain first party marketing). The bill also asks the FTC to study the feasibility of a unified opt-out mechanism (similar to Global Privacy Control in State laws) and authorizes the FTC to implement it via rulemaking.
  • “Third Party Collecting Entities”: The bill includes special requirements for “third party collecting entities” – companies (other than service providers) that derive their principal source of revenue from processing or transferring the data of individuals that the entity didn’t collect directly from the individual. This provision is clearly designed to target data brokers and potentially ad networks or processors that operate behind the scenes. Such entities must register with the FTC and comply with a Do Not Collect mechanism allowing individuals to delete their data. Also, companies that share data with these entities must identify each of them by name in their privacy policies.
  • Children & Minors: Building on recent concerns about harmful content directed at kids and teens, the bill would ban targeted advertising to minors under 17, as well as data transfers without consent. It also directs the FTC to create a division for Youth Privacy and Marketing, and asks the FTC’s IG to assess the effectiveness of COPPA’s safe harbor provisions. Here, the bill is clearly trying to address perceived weaknesses in COPPA – both its failure to protect teens and concerns that its safe harbor programs are inadequate.
  • Algorithmic Fairness: To address rising concerns about the link between data collection and civil rights, the bill restricts collecting, processing, or transferring data in a manner that is discriminatory or that makes unavailable equal enjoyment of goods or services on the basis of race, religion, disability, or other protected categories. It would also require “large data holders” to conduct annual algorithmic impact assessments, and other entities to do design evaluations of their algorithms.
  • Service Providers & Third Parties: In contrast to State laws and the GDPR, which use contractual requirements to control data use by service providers and third parties, the ADPPA regulates these entities directly. Service providers may only use data to perform services on behalf of covered entities, must promptly delete it thereafter, and may only transfer data to third parties with the affirmative express consent of the relevant individual (obtained via the covered entity). Third parties may not process data obtained from another entity contrary to individuals’ reasonable expectations.
  • Federal and State Enforcement: The bill authorizes FTC and State AG enforcement and sets forth a coordination scheme to prevent them from bringing duplicative actions. It also directs the FTC to establish a new Privacy Bureau; gives the FTC a wide array of rulemaking authority (sprinkled throughout the law); and authorizes it to approve compliance programs that “meet or exceed” the bill’s requirements.
  • Private Right of Action: The provision granting the private right of action is fairly complex, clearly reflecting extensive negotiations. It allows a person or class who suffers an injury due to a violation to bring a civil action in federal court but imposes a four-year delay until the PRA kicks in; bans mandatory arbitration clauses for minors only; and limits relief to compensatory damages, injunctions, and reasonable attorneys’ fees and costs. For certain private actions (those that seek an injunction, or that target smaller companies), there’s a 45-day right to cure. Litigants also must notify the FTC and relevant State AG in advance, who may bring their own actions (but not, it appears, halt the private action). Finally, the bill directs the FTC to study the impact of the PRA on the economy.
  • State Preemption: The preemption provision is similarly complex. It broadly preempts state laws that address the same issues as in the federal bill, and then claws back (i.e., preserves) particular laws or types of laws, including California’s data breach PRA (not the whole law, as has been mistakenly reported); Illinois’ biometric and genetic privacy laws; employee and student privacy laws; laws that solely address facial recognition, surveillance, wiretapping, or phone monitoring; state breach notification laws; and a range of general purpose laws.
  • Other Federal Laws: The bill generally preserves sector-specific privacy and data security laws like COPPA, GLBA and FERPA. One notable exception is that the bill prevents the FCC from using the Communications Act, or any rule issued under it, to take action against any covered entity for privacy violations. (Bracketed language would narrow the scope of this provision to satellite carriers, cable operators, or broadband providers.) The bill thus ousts the FCC of privacy jurisdiction in favor of the FTC, a move that some telecom groups have supported for years.

What’s Next?

As we write this post, House Commerce has just announced that it will hold a hearing on the ADPPA on June 14, and we’ve heard that the Senate may hold a privacy hearing on the same day. However, time is short in this election year and Senator Cantwell (who chairs Senate Commerce) still supports her own bill, not the ADPPA, arguing that the PRA is too limited (even as industry members say it’s too broad). Still, the bill has a chance; it’s earned its “breakthrough” moniker; and if it doesn’t pass this year, it will frame discussions moving forward.

Stay tuned as we continue to track progress on this bill.

* * *

Kelley Drye Unveils First-of-its-kind Advertising Law App

Download our free App – Ad Law Access – a first-of-its-kind, one-stop portal that provides updates and analysis on advertising, marketing, and privacy/data security law. The App is now available in the Apple App Store and Google Play, and can be used on iPhone, iPad, and Android devices.

Ten Percent and Rising: Connecticut Becomes Fifth U.S. State to Enact Privacy Law
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ten-percent-and-rising-connecticut-becomes-fifth-u-s-state-to-enact-privacy-law
Wed, 11 May 2022 17:25:39 -0400

On Tuesday, Connecticut became the fifth state to pass comprehensive privacy legislation when Governor Ned Lamont signed “An Act Concerning Personal Data Privacy and Online Monitoring” into law. Connecticut joins California, Virginia, Colorado, and Utah in enacting new privacy laws that take effect in 2023. Ten percent of the fifty U.S. states have now passed a comprehensive privacy law.

Effective July 1, 2023, the Connecticut law adopts a general framework of definitions, consumer rights, and compliance obligations based on concepts of data controller and data processor from the EU’s General Data Protection Regulation (GDPR), and the right to opt out of the “sale” of personal data as first articulated in the California Consumer Privacy Act (CCPA). Overall, the Connecticut law mirrors Colorado’s privacy law but then borrows select concepts from the California, Virginia, and Utah laws. The result is a hybrid of the pre-existing state laws, but one that does not introduce significant contradictions or unique compliance challenges.

The following are highlights of the Connecticut law:

  • Applicability. Consistent with other state laws, Connecticut’s privacy law applies to companies that do business in Connecticut and meet certain thresholds. To fall within the law, a company must process the personal data of more than 100,000 Connecticut consumers, or process the personal data of more than 25,000 Connecticut consumers and derive more than 25 percent of its revenue from the “sale” of personal data. The law does not apply to non-profits.
  • Federal Law Exemptions. Similar to other state laws, Connecticut exempts Covered Entities or Business Associates subject to HIPAA and Financial Institutions or data subject to the Gramm-Leach-Bliley Act. The law also exempts certain activities of consumer reporting agencies and furnishers and users of consumer reports where regulated by the Fair Credit Reporting Act.
  • Employee and B2B Exceptions. Similar to the Colorado, Virginia, and Utah laws, the Connecticut law does not apply to personal data of employees or individuals acting in a commercial context. California’s employee and B2B contact data exemptions are set to sunset at the end of 2022, although legislation is pending to extend these exemptions.
  • Opt-Out of Sale and Targeted Advertising. The Connecticut law provides a right to opt-out of the sale of personal data. The definition of a “sale” is “the exchange of personal data for monetary or other valuable consideration by the controller to a third party,” which is similar to the definition in the Colorado privacy law. The Connecticut law also provides for a right to opt-out of processing of personal data for purposes of targeted advertising. Similar to Colorado, “targeted advertising” is defined as “displaying advertisements to a consumer where the advertisement is selected based on personal data obtained or inferred from that consumer’s activities over time and across nonaffiliated internet web sites or online applications to predict such consumer’s preferences or interests.” The definition excludes targeting advertisements based on activities within a business’s own websites or apps.
  • Consent for Sales and Targeted Advertising Involving Minors: For minors at least 13 years of age and younger than 16 years of age, a business may not process data for targeted advertising or sell the minor’s data without consent. The minor has a right to revoke consent in a manner that is “at least as easy as the mechanism by which the consumer provided the consumer’s consent.” This provision is similar to the California right to opt-in to the sale or sharing of a minor’s personal information.
  • Consent to Process Sensitive Data. The Connecticut law requires consent to process sensitive data, similar to the Virginia and Colorado laws. The law defines sensitive data to include data revealing racial or ethnic origin, religious beliefs, mental or physical health conditions or diagnoses, sex life, sexual orientation, and citizenship and immigration status; genetic and biometric data that identifies an individual; and precise geolocation data. This definition is similar to the Virginia law; in contrast, the Colorado law does not include precise geolocation data within its definition of “sensitive data.”
  • Children’s Data. Personal data collected from a known child is considered to be sensitive data. The Connecticut law requires processing this data in accordance with the federal Children’s Online Privacy Protection Act (COPPA).
  • Consumer Rights. The Connecticut law provides consumers the right to access their personal data in a portable format, delete their personal data, and correct inaccuracies in their personal data.
  • Contract Terms. The Connecticut law includes a laundry list of terms that must be added to contracts between controllers and processors. These terms, which overlap with contract requirements in the other state privacy laws, include instructions on processing data, duty of confidentiality, deletion at the end of provision of services, requiring a processor to provide evidence of compliance to a controller, the opportunity to object to sub-processors, and controller audit rights.
  • Enforcement and Regulation. The Connecticut law does not include a private right of action or rulemaking process. There is a 60-day right to cure violations until Jan. 1, 2025. The law is the first to omit an express penalty per violation. Instead, a violation constitutes an unfair trade practice for purposes of Connecticut’s Consumer Protection Law, Section 42-110b. The Consumer Protection Law, in turn, grants the Connecticut Attorney General the right to seek injunctive relief and $5,000 statutory penalties only for willful misconduct.
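To make the arithmetic in the applicability thresholds concrete, here is a minimal sketch of Connecticut’s volume tests as a predicate. The figures come from the law as summarized above; the function and parameter names are our own, and the real analysis also requires that the company do business in Connecticut or target Connecticut residents (plus the payment-transaction-data carve-out on the 100K count).

```python
# Illustrative sketch only -- not legal advice and not drawn from statutory text
# beyond the two numeric thresholds summarized in the post above.

def ct_privacy_law_applies(consumers: int, pct_revenue_from_sale: float) -> bool:
    """Connecticut's two alternative volume thresholds.

    consumers: Connecticut consumers whose personal data is processed in the
        preceding year (excluding payment transaction data for the first test).
    pct_revenue_from_sale: share of revenue derived from the "sale" of
        personal data, expressed as a percentage (e.g., 30.0 for 30%).
    """
    high_volume = consumers > 100_000
    sale_driven = consumers > 25_000 and pct_revenue_from_sale > 25.0
    return high_volume or sale_driven

print(ct_privacy_law_applies(150_000, 0.0))   # True: volume alone suffices
print(ct_privacy_law_applies(30_000, 30.0))   # True: smaller, but sale-driven
print(ct_privacy_law_applies(30_000, 10.0))   # False: neither test met
```

Note that both thresholds are strict inequalities as drafted, so a company at exactly 100,000 consumers and 25 percent sale revenue would fall outside this sketch.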
The following chart summarizes and compares requirements of the five U.S. state privacy laws: the California Privacy Rights Act (CPRA), Colorado Privacy Act (CPA), Virginia Consumer Data Protection Act (VCDPA), Utah Consumer Privacy Act (UCPA), and Connecticut Act Concerning Personal Data Privacy (CPDP).

Effective Date
  • CPRA: January 1, 2023
  • CPA: July 1, 2023
  • VCDPA: January 1, 2023
  • UCPA: December 31, 2023
  • CPDP: July 1, 2023

Thresholds to Applicability
  • CPRA: Conducts business in CA; determines the purposes and means of processing personal info. of CA residents; and meets one of the following thresholds: >$25 million in annual revenue in the preceding year, buys/sells personal info. of >100K consumers or households, or earns >50% of annual revenue from selling or sharing personal info.
  • CPA: Conducts business in CO or targets products or services to CO residents, and meets either of these thresholds: processes personal data of >100K consumers in a year, or earns revenue or receives a discount from selling personal data and processes personal data of >25K consumers.
  • VCDPA: Conducts business in VA or targets products or services to VA residents, and meets either of these thresholds: processes personal data of >100K consumers, or processes personal data of >25K consumers and derives >50% of gross revenue from the sale of personal data.
  • UCPA: Conducts business in Utah or targets products or services to Utah residents; has more than $25 million in annual revenue; and either processes personal data of >100K consumers during a calendar year, or processes personal data of >25K consumers and derives >50% of revenue from the sale of personal data.
  • CPDP: Produces products or services that are targeted to CT residents, and in the preceding year either processes personal data of >100K consumers (excluding payment transaction data), or processes personal data of >25K consumers and derives >25% of revenue from the sale of personal data.

Sales
  • CPRA: Right to opt out of the sale of personal information; opt-in consent required to “sell” personal information of minors under age 16.
  • CPA: Right to opt out of the sale of personal data.
  • VCDPA: Right to opt out of the sale of personal data; the definition of a “sale” requires monetary consideration.
  • UCPA: Right to opt out of the sale of personal data; the definition of a “sale” requires monetary consideration.
  • CPDP: Right to opt out of the sale of personal data; opt-in consent required to “sell” personal data of minors 13 to 16.

Targeted Advertising
  • CPRA: Right to opt out of the “sharing” of personal information for purposes of cross-context behavioral advertising; opt-in consent required to “share” personal information of minors under age 16.
  • CPA: Right to opt out of targeted advertising.
  • VCDPA: Right to opt out of targeted advertising.
  • UCPA: Right to opt out of targeted advertising.
  • CPDP: Right to opt out of targeted advertising; opt-in consent required to process personal data of minors 13 to 16 for targeted advertising.

Global Privacy Control
  • CPRA: Yes (optional, subject to regulatory process).
  • CPA: Yes, required by July 1, 2024.
  • VCDPA: No.
  • UCPA: No.
  • CPDP: Yes, required by Jan. 1, 2025.

Sensitive Data
  • CPRA: Right to limit the use and disclosure of sensitive personal information.
  • CPA: Consent to process sensitive data.
  • VCDPA: Consent to process sensitive data.
  • UCPA: Provide notice and an opportunity to opt out of processing of sensitive data.
  • CPDP: Consent to process sensitive data.

Profiling
  • CPRA: Pending regulations.
  • CPA: Right to opt out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer.
  • VCDPA: Right to opt out of profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.
  • UCPA: N/A.
  • CPDP: Right to opt out of profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer.

Minor & Children’s Data
  • CPRA: Opt-in consent required to “sell” or “share” personal information of minors under age 16.
  • CPA: COPPA exception; obtain parental consent to process personal data concerning a known child.
  • VCDPA: Process sensitive data of a known child in accordance with COPPA.
  • UCPA: Process personal data of a known child in accordance with COPPA.
  • CPDP: Process sensitive data of a known child in accordance with COPPA; consent required to sell personal data of minors 13 to 16 or process their personal data for targeted advertising.

Consumer Rights
  • CPRA: Access, deletion, correction, portability.
  • CPA: Access, portability, deletion, correction.
  • VCDPA: Access, portability, deletion, correction.
  • UCPA: Access, portability, and deletion.
  • CPDP: Access, deletion, correction, portability.

Authorized Agents
  • CPRA: Permitted for all consumer rights requests.
  • CPA: Permitted for opt-out requests.
  • VCDPA: N/A.
  • UCPA: N/A.
  • CPDP: Permitted for opt-out requests.

Appeals
  • CPRA: N/A.
  • CPA: Must create a process for consumers to appeal refusal to act on consumer rights.
  • VCDPA: Must create a process for consumers to appeal refusal to act on consumer rights.
  • UCPA: N/A.
  • CPDP: Must create a process for consumers to appeal refusal to act on consumer rights.

Private Right of Action
  • CPRA: Yes, for security breaches involving certain types of sensitive personal information.
  • CPA: No.
  • VCDPA: No.
  • UCPA: No.
  • CPDP: No.

Cure Period
  • CPRA: 30-day cure period is repealed as of Jan. 1, 2023.
  • CPA: 60 days, until the provision expires on Jan. 1, 2025.
  • VCDPA: 30 days.
  • UCPA: 30 days.
  • CPDP: 60 days, until the provision expires on Dec. 31, 2024; starting Jan. 1, 2025, the AG may grant the opportunity to cure.

Data Protection Assessments
  • CPRA: Annual cybersecurity audit and risk assessment requirements to be determined through regulations.
  • CPA: Required for targeted advertising, sale, sensitive data, and certain profiling.
  • VCDPA: Required for targeted advertising, sale, sensitive data, and certain profiling.
  • UCPA: N/A.
  • CPDP: Required for targeted advertising, sale, sensitive data, and certain profiling.
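Three of the five laws contemplate a universal opt-out signal such as Global Privacy Control (GPC), which participating browsers transmit as an HTTP request header. As a rough illustration, here is a minimal sketch of server-side detection, assuming only that the visitor’s browser sends the “Sec-GPC: 1” header described in the GPC proposal; the function names and the ad-mode fallback logic are illustrative, not drawn from any of the statutes.

```python
# Illustrative sketch of honoring the Global Privacy Control signal.
# The "Sec-GPC" header name comes from the GPC proposal; everything else
# (function names, the contextual-ads fallback) is an assumption for demo
# purposes, not a compliance recipe.

def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True if the request carries the GPC opt-out signal (Sec-GPC: 1)."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"

def choose_ad_mode(headers: dict) -> str:
    """Fall back to contextual ads when the visitor signals an opt-out."""
    return "contextual" if gpc_opt_out_requested(headers) else "targeted"

print(choose_ad_mode({"Sec-GPC": "1"}))      # contextual
print(choose_ad_mode({"User-Agent": "x"}))   # targeted
```

In practice, a site would also need to propagate the opt-out to downstream ad partners and reconcile the signal with any existing consent records, which is where the state-law details above start to matter.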

Privacy Priorities for 2022: Tracking State Law Developments
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/privacy-priorities-for-2022-tracking-state-law-developments
Fri, 25 Mar 2022 13:10:13 -0400

The replay for our April 28, 2022 Privacy Priorities for 2022: Tracking State Law Developments webinar is available here.

In the absence of a federal privacy law, privacy has been at the forefront of many states’ legislative sessions this year. Against this backdrop, state attorneys general continue to initiate investigations into companies’ privacy practices, and state agencies continue to advance privacy rulemakings under existing law. Aaron Burstein, Laura VanDruff, and Paul Singer presented this webinar to help you learn about the latest developments in state privacy law, make sense of them, and understand their practical impact.

To view the webinar recording, click here or view it on the new Ad Law Access App.

Subscribe to the Ad Law Access blog to receive real-time updates on privacy and other related matters.

The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.

Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.

For easy access to all of our webinars, posts and podcasts, download our new Ad Law Access App.

Lina Khan’s Privacy Priorities – Time for a Recap
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/lina-khans-privacy-priorities-time-for-a-recap
Wed, 16 Mar 2022 11:47:07 -0400

Rumors suggest that Senator Schumer is maneuvering to confirm Alvaro Bedoya as FTC Commissioner sooner rather than later, which would give FTC Chair Khan the majority she needs to move forward on multiple fronts. One of those fronts is consumer privacy, for which Khan has announced ambitious plans (discussed here and here) that have stalled for lack of Commissioner votes. With Bedoya potentially on deck, now seems like a good time to recap those plans, as they might provide clues about what’s in the pipeline awaiting Bedoya’s vote. We focus here on three priorities Khan has emphasized in statements and interviews since becoming Chair.

Privacy Rulemakings

At the top of the list are privacy rulemakings, which could create baseline standards for the entire marketplace and enable the FTC to obtain monetary relief in its cases. (Recall that the FTC has limited authority to obtain money in its cases, especially post AMG, but that it can seek penalties or redress when it’s enforcing a rule.) Last December, Khan issued a Statement of Regulatory Priorities detailing the privacy rulemakings she wants to initiate or complete, including:

  • New rules to halt “abuses stemming from surveillance-based business models,” which could curb “lax security practices” and “intrusive surveillance,” “ensur[e] that algorithmic decision-making does not result in unlawful discrimination,” and potentially limit the use of “dark patterns” to manipulate consumers. (Yes, this is an ambitious one.)
  • Possible amendments to existing privacy rules – including the Children’s Online Privacy Protection Act (COPPA), the Health Breach Notification Rule, the Safeguards Rule (breach notification requirements), and the FACTA Identity Theft Rules (including the Red Flags Rule).
  • Possibly other new rules to “define with specificity unfair or deceptive acts or practices.”

Of note, absent Congressional legislation, any new privacy rules would need to follow the arduous process detailed in Section 18 of the FTC Act (referred to as “Magnuson-Moss” or “Mag-Moss” rulemaking). With Bedoya on board, the FTC can start these rulemakings, but they could still take years to complete, as we discuss here.

By contrast, the FTC can amend its existing privacy rules under the more manageable Administrative Procedure Act. Further, it’s already in the midst of rule reviews for all of the rules listed above (including COPPA’s, which started back in 2019). As a result, the FTC could act on these rules relatively quickly once Bedoya is on board.

Focus on Platforms

Khan has also made clear that she intends to focus on the tech platforms – which she has described as “gatekeepers” that use their critical market position to “dictate terms,” “protect and extend their market power,” and “degrade privacy without ramifications.” In a statement and accompanying staff report last September, Khan stated that such efforts would include:

  • Additional compliance reviews of the platforms currently subject to privacy orders (Facebook, Google, Microsoft, Twitter and Uber), followed by order modifications and/or enforcement as necessary.
  • As resources permit, examining the privacy implications of mergers, as well as potential COPPA violations by platforms and other online services – COPPA being of special importance as children have increasingly relied on online services during the pandemic. (Relatedly, report language accompanying the omnibus budget just signed into law directs the FTC to prioritize COPPA enforcement.)
  • Completion of the pending Section 6(b) study of the data practices of the social media companies and video streaming services, which was initiated in December 2020.

So far, we’ve seen limited action from the FTC on platforms (at least on the consumer protection side). Last October, the FTC issued a 6(b) report on the privacy practices of ISPs, but largely concluded that the topic should be addressed by the FCC. Then, in December, the FTC announced a settlement with online ad platform OpenX for COPPA violations. Given Khan’s bold plans in this area, it seems likely that there are matters in the pipeline awaiting Bedoya’s vote.

Stronger Remedies

The third major area that Khan has highlighted is obtaining stronger remedies in privacy cases – that is, considering “substantive limits”, not just procedural protections that “sidestep[] more fundamental questions about whether certain types of data collection and processing should be permitted in the first place.” By this, Khan is referring to deletion of data and algorithms, bans on conduct, notices to consumers, stricter consent requirements, individual liability, and monetary remedies based on a range of theories post AMG.

As to this priority, the FTC has moved ahead where it can (even prior to Khan’s tenure), often using strategies that have been able to garner unanimous votes. For example, its settlements with photo app Everalbum (for alleged deception) and WW International (for alleged COPPA violations) required deletion of consumer data and algorithms alleged to have been obtained illegally. Its settlement with fertility app Flo Health (for alleged deception about data sharing) required the company to notify affected consumers and instruct third parties that received their data to destroy it. The FTC also has alleged rule violations where possible, and partnered with other agencies to shore up its ability to obtain monetary relief.

But we’ve also seen signs of a more combative approach that could increase when Khan has the votes to push it forward. Of note, last September, the FTC issued an aggressive interpretation of the Health Breach Notification Rule, purporting to extend the rule’s reach (and thus its penalties) to virtually all health apps, even though a rule review was already underway. Further, FTC staff are making strong, often unprecedented demands for penalties, bans, and individual liability in consent negotiations. It’s even possible, based on an article written by former Commissioner Chopra and now-BCP Director Sam Levine, that the agency could attempt to use penalty offense notice letters (explained here) to lay the groundwork for penalties in privacy cases under Section 5(m)(1)(B). However, given the paucity of administratively litigated privacy cases (a key requirement under 5(m)(1)(B)), this would be very aggressive indeed.

* * *

For more on Khan’s privacy plans, you can read our earlier blogposts (here and here), as well as the various FTC statements and reports cited in this post. Or, if you like surprises, you can simply wait for Bedoya to be confirmed and see what happens. Needless to say, things should speed up at the FTC when he arrives.

Privacy Priorities for 2022: Tracking State Law Developments Thursday, March 24, 2022 at 4:00 pm ET / 1:00 pm PT Register Here

In the absence of a federal privacy law, privacy has been at the forefront of many states’ legislative sessions this year:

  • Utah is poised to be the fourth state to enact comprehensive privacy legislation
  • Florida came close to passing legislation when the State House advanced privacy legislation by a significant margin
  • Other state legislatures have privacy bills on their calendars

Against this backdrop, state attorneys general continue to initiate investigations into companies’ privacy practices, and state agencies continue to advance privacy rulemakings under existing law.

Please join us on Thursday, March 24 at 4:00 pm ET for this webinar to learn about the latest developments in state privacy law, make sense of these developments and understand their practical impact.

Webinar Replay: Privacy Priorities for 2022 - FTC https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/webinar-replay-privacy-priorities-for-2022-ftc https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/webinar-replay-privacy-priorities-for-2022-ftc Fri, 25 Feb 2022 15:09:21 -0500 The replay for our May 19, 2022 Privacy Priorities for 2022 webinar is available here.

Under Chair Lina Khan, the Federal Trade Commission has announced an aggressive privacy agenda, which is unfolding on the enforcement, regulatory, and policy fronts. In recent enforcement actions, the FTC has sought stringent remedies, including data deletion, bans on conduct, notices to consumers, stricter consent requirements, individual liability, and significant monetary relief based on a range of creative theories. The FTC has also announced that it intends to launch a rulemaking to limit "surveillance advertising," and has issued two rounds of guidance on its Health Breach Notification Rule -- a rule that has never been the basis for an FTC enforcement action and is currently part of an open rulemaking proceeding.

To help make sense of these developments -- and understand their practical impact -- Aaron Burstein and Jessica Rich took a deep look at these key recent developments and put them in the context of the FTC's recent challenges and setbacks.

To view the webinar recording, click here or view it on the new Ad Law Access App.

Subscribe to the Ad Law Access blog to receive real-time updates on privacy and other related matters.

The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.

Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.

Keep up with all things Ad Law through the Ad Law Access App, now available as a free download in the Apple App Store and Google Play for use on iPhone, iPad, and Android devices.

Top Privacy Issues to Watch in 2022 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/top-privacy-issues-to-watch-in-2022 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/top-privacy-issues-to-watch-in-2022 Wed, 19 Jan 2022 21:28:04 -0500 You’ve probably seen a lot of privacy forecasts for 2022 during the past few weeks. Here’s one that reflects the collective thoughts of our diverse privacy team, which includes former high-level officials from the FTC and State AG offices, and practitioners who have been advising clients about privacy for over 30 years.

Note: Our team will discuss these issues, along with practical suggestions for how companies can tackle privacy challenges, in a January 26 webinar at 4 pm ET. Please tune in! You can register here.

  • State privacy developments will continue to drive much of the U.S. privacy debate.
    • California and Colorado will launch rulemakings to implement their laws, setting an example for other jurisdictions and prompting industry changes even beyond their borders. Meanwhile, companies will be gearing up for the effective dates of all three state laws (January 1, 2023 for California and Virginia, and July 1, 2023 for Colorado).
    • With multiple bills already pending in other states, we may see additional state laws by year’s end. Draft bills introduced thus far suggest a range of approaches that vary from existing laws, suggesting compliance may become even more complex in the coming year.
    • Even states without comprehensive privacy laws will seek to use their “unfair and deceptive” trade practice authority in increasingly creative ways to address privacy. A recent example is Arizona’s effort to challenge Google’s collection and use of location data.
  • The FTC will pursue an aggressive privacy agenda, pushing the boundaries of its legal authority and seeking to move the goalposts governing data collection, use, and sharing.
    • It will launch a broad “surveillance” rulemaking under its Magnuson-Moss procedures, seeking strict limits on personalized advertising, lax security practices, and algorithmic discrimination. (As we discuss here, though, the rule will likely take years to complete.)
    • It will increase enforcement of sectoral privacy laws and rules (e.g., FCRA, COPPA, GLB Privacy, Red Flags), so it can get monetary relief, post AMG. It also will try to obtain settlements for alleged violations of the Health Breach Notification Rule – which it “clarified” in a 2021 policy statement covers virtually all health apps.
    • It will focus on tech platforms and other large companies, through both aggressive enforcement and high-profile studies, such as its upcoming report on social media companies.
    • In all of its privacy cases, the FTC will seek stringent remedies, including data deletion, bans on conduct, notices to consumers, stricter consent requirements, individual liability, and significant monetary relief based on a range of creative theories. (See our scorecard on the FTC’s use of such theories here.)
  • Other federal agencies will flex their muscles on privacy and data security, scrutinizing and regulating companies within their areas of jurisdiction.
    • For example, the CFPB recently ordered the tech giants to turn over information regarding the data practices of payments systems they operate. The FCC just moved to update breach reporting requirements under the CPNI rules. And the SEC just fined eight broker-dealers and investment companies for their “deficient cybersecurity procedures.”
    • Expect these types of actions to accelerate in the coming year, as privacy continues its ascent as a top regulatory, consumer protection, and risk management issue.
  • Developments in and around the tech platforms will continue to have ripple effects across the entire marketplace.
    • The tech platforms (yeah, them again) will continue to tighten their rules governing data sharing, third-party cookies, use of identifiers, and access to their platforms, forcing other companies to develop new ways to market their brands.
    • “Big tech” antitrust challenges will advance through legislatures and the courts, requiring policymakers and enforcers to finally confront the tension between competition interests (which seek to expand access to data) and privacy interests (which seek to limit access).
  • Cross-border data transfers will become ever more difficult, as Privacy Shield remains unresolved and the EU accelerates GDPR enforcement.
    • For example, Austria’s DPA recently held that Google Analytics violated the GDPR when it transferred EU citizens’ IP addresses and cookie identifiers to the U.S., notwithstanding Google’s claim that it had protective measures in place.
    • Further, the record fines being obtained for GDPR violations (a reported seven-fold spike in 2021) will increase the peril for multinational companies that transfer data as part of their operations.
  • The plaintiff’s bar will continue to test the limits of addressing privacy in private litigation, despite some setbacks in 2021.
    • The setbacks include the high bar set by the Supreme Court regarding the proof of harm necessary to confer standing in privacy cases. In addition, neither Virginia nor Colorado included a private right of action in their comprehensive privacy laws.
    • However, the California law includes a private right of action for data breaches, and pending legislative proposals in other states include private rights of action for privacy, security, or both. Plaintiffs also are employing other statutory frameworks to address privacy, such as the contract laws cited in the recent class action against Zoom, and the call recording laws cited in session-replay lawsuits.
  • Congress will continue to debate whether to pass a federal privacy law.
    • Yes, it’s safe to assume that the never-ending debate will continue! The harder question is whether Congress will finally pass anything.
    • It’s possible. Businesses have never wanted a federal privacy law more – to deal with the specter of more state privacy laws, “overreach” by the FTC, the EU’s heightened enforcement efforts, and the overall confusion created by fragmented privacy regimes (i.e., all of the issues discussed above).
    • The more likely scenario, however, is that Congress will pass something narrower, like a bill to amend COPPA or provide new privacy protections for teens, which could be an area of consensus among Democrats and Republicans. (Another possibility, just proposed by some Democrats, is legislation to ban “surveillance advertising,” similar to the rule that the FTC is planning. However, that would likely be a much more divisive issue in Congress.)
Privacy remains at the forefront in 2022. In our January 26 webinar, we will help you think about what to monitor and what to prioritize. Please join us, and feel free to send us a note if you have questions that you’d like us to address in the webinar.

State Attorneys General 2022 Predictions https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/state-attorneys-general-2022-predictions https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/state-attorneys-general-2022-predictions Mon, 10 Jan 2022 17:56:59 -0500

State Attorneys General are already off to the races in 2022 – with a significant number of election campaigns in full swing and an uptick in their consumer protection enforcement efforts. As a result, State AG consumer protection topics will play a big part in 2022. Our Kelley Drye State Attorneys General team will present a webinar on these State AG priorities on January 27. In the meantime, we provide a snapshot of what’s to come this year.

A unique year will bring extra attention to state consumer protection enforcement. State Attorneys General are in a prime position to dominate the consumer protection news of 2022. With over 30 AG seats up for election or appointment and an increased use of broad consumer protection laws to promote social agendas, expect consumer protection topics to be a focal point of some campaigns. Add to that the FTC’s increased reliance on State AGs to secure monetary relief following the Supreme Court’s 2021 AMG decision, AGs complaining that investigations are moving too slowly, and an increased desire to publicize AG investigations, and you should expect to hear a lot more about AG actions this coming year.

New laws will lead to new privacy and marketing priorities. Keeping up with new privacy laws is on the forefront of businesses’ minds these days, with significant new laws in California, Colorado, and Virginia. Not only are states involved in rulemaking and implementation of those laws, but they will be under pressure to quickly bring enforcement proceedings. New marketing laws, especially in the area of subscription services in states like Delaware and Colorado, will have a similar effect. Expect to see other states trying to keep up by stepping up enforcement under existing state and federal authority, while their legislatures consider new state-specific laws.

Technology and public health issues will remain a focus for Attorneys General. 2021 saw a number of enforcement actions by Attorneys General combatting “big tech,” both under antitrust and general consumer protection laws. For instance, 33 State Attorneys General recently weighed in on the CFPB’s November Notice and Request for Comment on Big Tech Payment Platforms, highlighting their unique position to discuss concerns regarding consumer complaints on a range of issues. In addition, State AGs have been at the forefront of addressing public health crises, notably regarding marketing practices in the opioids and vaping industries. These efforts will not only remain priorities, but can be expected to receive increased attention in 2022 as State AGs expand their efforts -- including along party lines, on topics such as content moderation and climate change.

Proactive partnership opportunities will be plentiful. One notable theme that emerged at the end of 2021 was a renewed call by Attorneys General for the corporate world to partner with them in combatting intractable frauds, especially those that have become more pervasive through new technology. With NAAG President Attorney General Miller’s Presidential Initiative of “Consumer Protection 2.0: Tech Threats and Tools” comes opportunity for business to engage with the AG community to educate consumers about scams and innovate to benefit consumers and businesses alike.

And of course, all of these priorities can change at the drop of a hat when emergency situations call for immediate reaction. For example, the recent Omicron surge has led to additional warnings of price gouging on COVID tests and fake test kits, areas you can expect the Attorneys General to aggressively pursue in early 2022. It is more important than ever to watch the State AG actions and keep up with their enforcement priorities, and we hope you can join us on January 27 to hear more from our team.

* * *

As mentioned above, please join Kelley Drye State Attorneys General practice Co-Chair Paul Singer, Advertising and Marketing Partner Gonzalo Mon, Privacy Partner Laura VanDruff, and Senior Associate Beth Chun for State Attorney General Consumer Protection Priorities for 2022. This webinar will provide discussion and practical information on the topics mentioned above and other state consumer protection, advertising, and privacy enforcement trends. Register here.

Also join us for Privacy Priorities for 2022: Legal and Tech Developments to Track and Tackle, a joint webinar between Kelley Drye’s Privacy Team and Ketch, a data control and programmatic privacy platform. This Data Privacy Week webinar will highlight key legal and self-regulatory developments to monitor, along with practical considerations for how to tackle these changes over the course of the year. This will be the first in a series of practical privacy webinars by Kelley Drye to help you keep up with key developments, ask questions, and suggest topics that you would like to see covered in greater depth. Register here.

New Mexico Attorney General Settles Google Children’s Privacy Cases: A Unique Settlement Adds to a Complicated Landscape https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-mexico-attorney-general-settles-google-childrens-privacy-cases-a-unique-settlement-adds-to-a-complicated-landscape https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-mexico-attorney-general-settles-google-childrens-privacy-cases-a-unique-settlement-adds-to-a-complicated-landscape Thu, 16 Dec 2021 15:38:52 -0500 On December 13, the New Mexico Attorney General announced a settlement with Google to resolve claims regarding children’s privacy, including in the burgeoning EdTech space. The federal lawsuits Balderas v. Tiny Lab Productions, et al. and Balderas v. Google LLC, respectively, alleged COPPA and privacy violations related to collection of children’s information on game developer Tiny Lab’s apps and on Google’s G Suite for Education products. There are many features of this settlement that are worth discussing further as either potential future trends, or novel provisions.

Privacy Compliance Provisions

New Mexico’s injunction related to the Tiny Lab case includes changes to Google Play which will take effect after 120 days. Some of the specific measures include:

  • revising Google Play Families policies and including additional help pages to assist app developers in compliance;
  • requiring all developers to complete a form to indicate the targeted age group of apps;
  • using a rubric to evaluate app submissions, helping determine whether an app appeals to kids and checking for consistency with the age group form;
  • requiring Families apps to certify they will comply with COPPA;
  • requiring all apps to only use SDKs that certify compliance with Google’s policies including COPPA;
  • requiring developers of Families apps to disclose collection of any children’s data including through third parties;
  • requiring a link to the app’s privacy policy on the Google Play store page; and
  • communicating to AdMob whether an app is child-directed, so that AdMob will then follow COPPA with respect to that app’s data.
The help pages the injunction requires do not just contain answers to frequently asked questions. They prescribe certain decisions by and limitations on third parties using the Google Play store. For example, Exhibit 3 to the injunction provides “if you serve ads in your app and your target audience only includes children, then you must use Google Play certified SDKs.”

In addition to these injunctive provisions, Google agreed to a set of voluntary enhancements to the Google Education platform intended to promote safety for students. New Mexico’s enforcement of these provisions is limited to its ability to confirm that Google has made the changes, or inquire as to the status of changes not made.

These injunctions demonstrate continued state Attorney General scrutiny regarding children’s information. And they come at a time that the Federal Trade Commission, which is responsible for issuing the COPPA Rule, is redoubling its COPPA efforts. The FTC’s ongoing COPPA Rule Review includes a number of questions regarding the intersection of COPPA and education technology. The FTC’s Statement of Regulatory Priorities, which we wrote about here, identifies COPPA as a top priority. And just this week, the FTC released its first COPPA settlement in almost 18 months.

Additional Settlement Terms Depart from Historical State Settlements

Not to be ignored, several other provisions of the settlement have unique aspects that are extremely noteworthy. Google has agreed to pay New Mexico $5.5 million – with $1.65 million of that going to outside counsel for the state. The remaining payment will be used to fund the “Google New Mexico Kids Initiative” – a program jointly run by Google and New Mexico to award grants to schools, educational institutions, charitable organizations, or governmental entities. This unique allocation of the payment could invite the kind of scrutiny that other State Attorney General settlements have met in the past when they attempted to designate funds to specific third-party recipients. Some state legislatures may see it as an effort to appropriate funds without their involvement.

While New Mexico reserves its rights under the agreement regarding public statements, it has agreed to provide Google 24-hour notice before making any written public statement. Moreover, New Mexico agrees to consider in good faith any suggestions or input Google has, and any statement will reference the parties’ shared commitment to innovation and education. States routinely resist any efforts to negotiate press in this manner, and it is unclear how enforceable a provision like this could really be anyway. That said, this certainly reflects the cooperative nature of the agreement, in which case it’s fair to assume the State would issue press reflecting such cooperation anyway.

Google and New Mexico have also agreed to an ADR provision, requiring the state to pursue any disputes relating to the agreement in mediation prior to pursuing relief. This again is fairly unique for a State AG settlement, as is the overall form of the document (a “Settlement Agreement and Release”) – normally states will only settle matters through a consent judgment or a statutorily authorized Assurance of Compliance or Discontinuance. But just like some of the other unique provisions, agreeing to ADR may be more of a reflection of the cooperative nature of the agreement, and certainly presents opportunity for a more streamlined enforcement mechanism in the future.

It remains to be seen if these provisions will serve as a template for future state agreements with other companies, but given that state Attorneys General continue to pursue Google on a variety of fronts[1], New Mexico’s settlement will certainly be relevant in any future settlement efforts.

[1] Google Search Manipulation, Google Ad Tech, Google DOJ Search Monopoly, State of Arizona v. Google LLC geolocation privacy

Senate Hearing on Promoting Competition and Privacy in the Tech Sector: Two Hearings in One? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/senate-hearing-on-promoting-competition-and-privacy-in-the-tech-sector-two-hearings-in-one https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/senate-hearing-on-promoting-competition-and-privacy-in-the-tech-sector-two-hearings-in-one Wed, 15 Dec 2021 10:28:07 -0500 On December 7, 2021, the Senate Finance Committee’s Subcommittee on Fiscal Responsibility and Economic Growth conducted a hearing on “promoting competition, growth, and privacy protection in the technology sector.” The hearing could have been conducted using a split-screen format, since one group of Senators and witnesses focused on anti-competitive behavior by the tech giants and another focused on privacy and security concerns raised by data brokers.

Chair Elizabeth Warren (leader of the first group) said Congress should provide better tools to the FTC to break up companies like Amazon and, along with other senators, questioned witnesses on how monopolies hurt workers and the economy. Ranking member Bill Cassidy (leader of the second) focused on the threats of the unregulated data brokerage industry, and the need for comprehensive federal privacy legislation to protect consumer privacy and national security.

Here’s what the competition witnesses said (in brief):

  • Courtenay Brown, an Amazon associate at the company’s fulfillment center in Avenel, New Jersey, and a member leader with United for Respect, focused on the need to hold Amazon accountable for the poor working conditions she and other employees face daily.
  • Karl A. Racine, Attorney General for the District of Columbia, described the lawsuit his office is pursuing against Amazon for unfairly and unlawfully increasing prices on Amazon’s website, stifling competition, and taking advantage of consumers.
  • Barry C. Lynn, Executive Director, Open Markets Institute, discussed the monopolistic practices of tech giants such as Google, Facebook, and Amazon, which, according to Lynn, (1) perpetuate low wages, high prices, sharp declines in entrepreneurship, and political extremism, and (2) have caused the supply chain problems that are now occurring across the world.

Here’s what the data broker witnesses said (in brief):

  • Justin Sherman, from the Data Brokerage Project at Duke’s Sanford School of Public Policy, explained that data brokerage is a “virtually unregulated practice” in the United States that enables advertisers and businesses to target marginalized communities and allows foreign governments to compile sensitive personal data with few controls. He proposed three steps Congress should take immediately: (1) strictly control data broker sales to foreign companies, citizens, and governments; (2) strictly control the sale of sensitive information, such as genetic, health, and location data; and (3) stop data brokers from circumventing controls by “inferring” data.
  • Samm Sacks, from Yale Law School’s Paul Tsai China Center and the New America Foundation, focused on data security in the context of the U.S.-China relationship, cross-border data flows, and national security. Noting that the lack of comprehensive data privacy regulation in the United States makes our data vulnerable to both sophisticated state actors and unregulated data brokers, Sacks advocated for enactment of a federal law setting basic standards for all companies, restricting data broker practices, and limiting exports of personal data to foreign countries. She cautioned, however, against simply “cutting and pasting” the GDPR, which can end up serving only the companies that are wealthy enough to bear the burdens and costs of compliance. Sacks also supported mechanisms to facilitate cross-border data flows with likeminded countries (subject to appropriate controls), since U.S. security and prosperity rely on international cooperation with allies.
  • Stacey Gray, from the Future of Privacy Forum, recommended that Congress pass baseline privacy legislation that establishes clear rules for both data brokers and first-party companies that process personal data. She also outlined more incremental steps that Congress could take, such as establishing a national registry or opt out, or limiting the ability of law enforcement and intelligence agencies to purchase information from data brokers (as required in proposed legislation from Senator Wyden). In addition, Gray recommended that Congress strengthen the FTC by increasing its staff and funding, establishing a privacy bureau, and authorizing civil penalty authority.

Like many hearings on these issues, this one examined the issues but did not appear to be leading to any particular legislative solution. We are still left to wonder: (1) Will Congress finally be able to negotiate and pass comprehensive federal privacy legislation? (2) What will Congress do to address the many concerns that have been raised about the power of the tech giants, and how will Congress choose which issues to prioritize?

We will continue to monitor developments on this issue and provide updates as they occur.

NTIA’S Listening Sessions On Privacy and Civil Rights: What's the Significance? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ntias-listening-sessions-on-privacy-and-civil-rights-whats-the-significance https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ntias-listening-sessions-on-privacy-and-civil-rights-whats-the-significance Mon, 06 Dec 2021 20:07:45 -0500 In case you missed it, last week (on November 30), the National Telecommunications and Information Administration (NTIA) announced that it would convene a series of virtual listening sessions on privacy, equity, and civil rights. According to NTIA, the sessions (scheduled for December 14, 15, and 16) will provide data for a report on “the ways in which commercial data flows of personal information can lead to disparate impact and outcomes for marginalized or disadvantaged communities.”

NTIA cites the following examples to illustrate how data collection, “even for legitimate purposes,” leads to disparate impacts:

  • Digital advertising offers content and opportunities based on proxy indicators of race, gender, disability and other characteristics, perpetuating historical patterns of discrimination.
  • Insurance companies use information such as neighborhood, safety, bankruptcy, and gun ownership to infer who will need expensive health care, warranting higher premiums.
  • Universities predict which students will struggle academically based on factors that include race.

Why is this News?

As our readers may have noticed, NTIA is hardly the first agency or constituency to draw the link between data collection and discrimination. In 2013, Harvard Professor Latanya Sweeney published a groundbreaking study showing racial discrimination and stereotyping in online search and ad delivery. In 2014, the FTC hosted a workshop, followed by a report (Big Data: A Tool for Inclusion or Exclusion?) detailing the problem and making recommendations for companies and researchers. In recent years, scores of studies and conferences have examined the discriminatory assumptions embedded in algorithms and artificial intelligence (AI). And civil rights groups have raised concerns for years and, in 2019, obtained an historic settlement with Facebook to stop discrimination on its online advertising platform.

NTIA’s announcement is nevertheless significant for two reasons. First, by its own description, NTIA is the President’s principal advisor on information policy issues, responsible for evaluating the impact of technology on privacy and the sufficiency of existing privacy laws. Further, its announcement states that the listening sessions are designed to “build the factual record for further policy development in this area.” For these reasons, the notice has been heralded as the Administration’s “first move” on privacy and a possible attempt to revive stalled efforts in Congress to enact a federal privacy law.

Second, in case there was any doubt, NTIA’s announcement affirms that the link between privacy and civil rights is now a widely accepted policy position, and will remain front-and-center in any debate about whether to enact a comprehensive federal privacy law. Whereas once there were questions about whether civil rights provisions should be “added” to a privacy law, now they’re essential building blocks.

This is true not only among Democrats, but among Republicans too. For example, provisions related to discrimination and/or algorithmic decision-making appear in recent privacy legislative proposals from, not just Representative Eshoo and Senator Cantwell, but also Senator Wicker and the Republican members of the House Energy and Commerce (E&C) Committee. The Republican E&C bill is especially notable for how much it leans into the issue – prohibiting data practices that “discriminate against or make an economic opportunity unavailable on the basis of race, color, religion, national origin, sex, age, political ideology, or disability or class of persons.”

But What Does this Mean for Companies Today?

You may be wondering – what does this mean for companies now, with Congress still (endlessly) debating whether to pass federal privacy legislation? It means that:

  • Data discrimination is on everyone’s radar, regardless of whether Congress finally decides to pass a federal privacy law.
  • Companies should expect more enforcement – even now, under existing laws – challenging data practices that lead to discriminatory outcomes. Such laws include the FTC Act (recently used to challenge racial profiling by an auto dealer), state UDAP laws, the Fair Credit Reporting Act, the Equal Credit Opportunity Act, and (of course) the civil rights laws.
  • To steer clear of discrimination (and any allegations of discrimination), companies should test their data systems and use of algorithms and AI for accuracy and fairness before using them in the real world.

We will continue to monitor developments on this issue and post updates as they occur.

Paul Singer Joins Kelley Drye's State AG Practice https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/paul-singer-joins-kelley-dryes-state-ag-practice https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/paul-singer-joins-kelley-dryes-state-ag-practice Tue, 09 Nov 2021 10:45:09 -0500

Texas Associate Deputy Attorney General Paul Singer has joined Kelley Drye as a partner in the firm's growing State Attorneys General practice group, and we are thrilled to welcome him. On the heels of former top Federal Trade Commission (FTC) officials Jessica L. Rich and Laura Riposo VanDruff joining the firm, Paul's addition further strengthens Kelley Drye's ability to help clients prepare for the future of consumer protection law, and to advise clients facing State Attorneys General investigations, as well as investigations brought by the FTC, the Consumer Financial Protection Bureau, and local and county District Attorneys' offices.

Paul has more than 20 years' experience at the Texas Attorney General's office, having spent the majority of his time in the Consumer Protection Division, including as Division Chief. Most recently, he served as part of the agency's Executive Leadership team as the Associate Deputy Attorney General for Civil Litigation, where he oversaw all plaintiff-oriented civil litigation for the Attorney General, including Consumer Protection, Environmental Protection, Antitrust, Civil Medicaid Fraud, and Bankruptcy and Collections. A significant portion of his practice has been leading major multistate investigations, many involving Fortune 500 companies, where he repeatedly led 50 states plus the District of Columbia to settlement, many in the multi-million-dollar range.

In recognition of his contributions to the National Association of Attorneys General (NAAG) and the nation's Attorneys General through exemplary expertise and achievement, Paul received the Career Staff of the Year Award from NAAG in 2020. In 2018, he received the Faculty of the Year Award from the National Attorneys General Training and Research Institute (NAGTRI) for his significant contributions to developing and presenting high quality legal training programs for his counterparts across the country. He received his J.D. from the University of Texas School of Law, and his B.A. with high honors from the University of Texas at Austin.

Also joining Kelley Drye from the Texas Attorney General's office is senior associate Beth Bolen Chun. Ms. Chun comes from the Consumer Protection Division of the Office of the Attorney General of Texas, where she investigated and prosecuted violations of the Texas Deceptive Trade Practices Act and other state and federal consumer protection laws.

Kelley Drye has a national reputation as counsel for corporate defendants in federal and state investigations and litigation. The firm offers a unique combination of substantive subject matter expertise and relationships at every level, from staff attorney to State Attorneys General and FTC Commissioners.


Subscribe here to Kelley Drye’s Ad Law News and Views newsletter to see another side of Jessica, Laura and others in our second annual Back to School issue. Subscribe to our Ad Law Access blog here.

]]>
A New Era for the FTC and U.S. Privacy? House Reconciliation Bill Would Give the FTC $500 Million to Build a New Privacy Bureau. https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/a-new-era-for-the-ftc-and-u-s-privacy-house-reconciliation-bill-would-give-the-ftc-500-million-to-build-a-new-privacy-bureau https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/a-new-era-for-the-ftc-and-u-s-privacy-house-reconciliation-bill-would-give-the-ftc-500-million-to-build-a-new-privacy-bureau Tue, 02 Nov 2021 10:13:35 -0400 As we’ve all been following in the news, the House reconciliation bill to fund “human infrastructure” is still mired in negotiations, ever on the verge of either passing to monumental fanfare, or cratering in failure. Tucked away on page 671 of the 1684-page bill is a short provision that, despite scant attention, has the potential to usher in a new era for the FTC and U.S. privacy – $500 million to fund a brand new FTC privacy bureau, to be spent between 2022 and 2029. Here’s what the provision says:

FEDERAL TRADE COMMISSION FUNDING FOR A PRIVACY BUREAU AND RELATED EXPENSES. In addition to amounts otherwise available, there is appropriated for fiscal year 2022, out of any money in the Treasury not otherwise appropriated, $500,000,000, to remain available until September 30, 2029, to the Federal Trade Commission to create and operate a bureau, including by hiring and retaining technologists, user experience designers, and other experts as the Commission considers appropriate, to accomplish its work related to unfair or deceptive acts or practices relating to privacy, data security, identity theft, data abuses, and related matters.

With all of the talk about trillions here and billions there for infrastructure, $500 million might sound like chump change. But to put it in perspective, the FTC’s current annual budget is about $350 million – covering all of its programs, including its entire antitrust mission and the many components of consumer protection, of which privacy is just one. Further, the FTC currently employs just 61 people to staff its privacy mission, at a cost of about $13 million – less than 1/38th of the proposed new $500 million budget (or about 1/5 on an annual basis, when the amount is spread over its eight-year duration). Indeed, when one of us called for the creation of a new FTC privacy bureau last March, it seemed inconceivable that such a bureau could launch with a half-billion-dollar wind in its sails.
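For readers who want to verify those comparisons, a quick back-of-the-envelope calculation (using only the figures cited above; the variable names are ours) bears them out:

```python
# Figures from the reconciliation bill and the FTC's current budget, as cited above.
new_bureau_total = 500_000_000      # proposed appropriation, available FY2022 through FY2029
duration_years = 8                  # the 2022-2029 spending window
current_privacy_cost = 13_000_000   # approximate current annual cost of the FTC's 61 privacy staff

annualized = new_bureau_total / duration_years       # about $62.5 million per year
vs_total = new_bureau_total / current_privacy_cost   # ~38x current annual privacy spend
vs_annual = annualized / current_privacy_cost        # ~5x on a year-over-year basis

print(f"${annualized:,.0f} per year; {vs_total:.0f}x total, {vs_annual:.1f}x annualized")
```

In other words, even spread over eight years, the proposed bureau's budget would be roughly five times what the FTC spends on privacy staffing today.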

With $500 million, the FTC could employ hundreds of additional staff – including lawyers, investigators, technologists, and other experts – to oversee U.S. privacy. It could bring (and litigate) “big cases” (and smaller ones too), study key industries, conduct consumer surveys, provide more personalized assistance to U.S. consumers, and provide greater leadership and guidance (through public events and user-friendly publications) here and abroad. (And yes, the FTC could also launch rulemakings to expand existing rules, or to launch new ones under its inherent Magnuson-Moss rulemaking authority.) For those who have been calling for a brand new U.S. privacy agency, this new, well-funded privacy bureau could go a long way to satisfying their goals.

Of course, what this legislation would not do is strengthen the FTC’s legal authority by (among other things) enacting a comprehensive federal privacy law, giving the FTC full jurisdiction over common carriers and nonprofits, and authorizing monetary remedies for privacy violations. For years, the FTC and others have argued that these types of legal reforms are necessary to ensure that the agency can be fully effective in protecting consumers’ privacy.

Nevertheless, if the legislation passes, businesses should expect more oversight in the form of investigations, litigation, warning letters, studies, surveys, and rulemakings. Indeed, the FTC has already highlighted some of these goals and aspirations in early October, which we discussed in an October 3 blog post on the topic. With $500 million, the FTC’s “wish list” would grow exponentially longer and could include, for example, enforcement “sweeps” to examine companies’ privacy practices, even in the absence of any suspicion of wrongdoing. We are following this issue closely and will post additional details as they unfold.

]]>
GLBA Safeguards Gets a Makeover: Why it Matters for Businesses with Customer Information https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/glba-safeguards-gets-a-makeover-why-it-matters-for-businesses-with-customer-information https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/glba-safeguards-gets-a-makeover-why-it-matters-for-businesses-with-customer-information Mon, 01 Nov 2021 23:59:53 -0400 In a much-anticipated announcement last week, the FTC amended the Gramm-Leach-Bliley Act (GLBA) Safeguards Rule, and proposed a further amendment requiring certain financial institutions to provide the FTC with notice in the event of certain security events. Although these changes were announced after FTC Commissioner Chopra left the agency to lead the CFPB, he apparently voted before leaving to ensure 3-2 approval of the amendments by a Commission that remains divided.

What is GLBA Safeguards?

For nearly 20 years the Safeguards Rule has required financial institutions to develop, implement, and maintain comprehensive information security programs to protect their customers’ personal information. Such programs must be appropriate to each entity’s “size and complexity, the nature and scope of [its] activities, and the sensitivity of the customer information at issue.” For a generation, the Rule’s requirements have influenced data security standards in other sectors, emphasizing a flexible, process-based approach. The amended Rule replaces some of that flexibility with more specificity.

Violations of the Safeguards Rule are enforced through the FTC Act. Unlike other FTC rule violations, Safeguards violations are not subject to civil penalties.

The definition of “financial institutions” is broad. It includes businesses that are “significantly engaged” in financial activities or significantly engaged in activities incidental to financial activities, which include businesses as diverse as real estate settlement services, credit counseling services, and professional tax preparers. The amendments expand the definition of financial institutions to expressly include “finders,” which are companies that bring together buyers and sellers of a product or service for personal, family, or household purposes, in a manner that is incidental to financial activities, such as a lead generator that helps consumers find a financial institution for a home mortgage or car insurance.

FTC’s Changes to the Rule’s Requirements

The FTC’s amended Rule, which takes effect one year from its publication in the Federal Register, makes significant substantive changes in four general areas that largely follow the agency’s 2019 Notice of Proposed Rulemaking (which we summarized here):

(1) First, it adds more specific requirements related to the design and implementation of a safeguards program, including:

  • Access Controls. Periodic review of access controls, including “technical and physical controls” to limit access only to authorized users and only to necessary customer information.
  • Data and Systems Inventory. Inventory of the data in the financial institution’s possession, the systems on which (and facilities where) that data is collected, stored, or transmitted, and an understanding of the relevant portions of applicable systems and their importance.
  • Encryption. Encryption of all customer information in transit and at rest. If encryption is not feasible, implementation of comparable controls with the approval of the “Qualified Individual” (see below description of this role).
  • Multi-Factor Authentication. Implementation of MFA to access any information system, or equivalent or stronger controls.
  • Secure Applications. Adoption of secure development practices for applications developed in-house and assessment of externally developed applications.
  • Change Management. Adoption of procedures governing changes to a company’s safeguards.
  • Intrusion Detection. If a company is unable to implement “effective continuous monitoring . . . to detect . . . changes in information systems that may create vulnerabilities,” it must perform annual penetration testing and twice-yearly vulnerability assessments.
  • Incident Response. Development of a written incident response plan that meets specific criteria.

(2) Second, it adds provisions requiring appointment of a single “Qualified Individual” to oversee the program and report to the board of directors or equivalent governing body.

(3) Third, it exempts from some of the Rule’s requirements entities that collect information from fewer than 5,000 customers.

(4) Finally, the Rule sets forth many terms and related examples in the Rule itself rather than incorporating them by reference.

Commissioners Phillips and Wilson dissented from approval of the final rule, expressing concern that the new requirements could weaken data security by “diverting finite resources towards a check-the-box compliance exercise and away from” tailored risk management. They explained that the record did not support the Rule revisions.

FTC Proposes New Notification Requirement

Simultaneously with its release of the above amendments, the FTC proposed a further amendment to require financial institutions to report certain security events to the FTC within 30 days of discovery. In its Supplemental Notice of Proposed Rulemaking, the FTC recommended that when a financial institution determines that misuse of customer information has occurred or is reasonably likely, and at least 1,000 consumers have been or may reasonably be affected, the financial institution must notify the agency. The FTC reasoned that this proposed requirement would impose “little additional burden” while facilitating the agency’s enforcement of the underlying rule. The FTC anticipates making the reported information public through a database it would “update periodically.” The proposed notification requirement would not allow for any delay of the 30-day reporting obligation for financial institutions cooperating with law enforcement.

The FTC seeks comments on its proposed notification requirement, including on the following:

  • Whether the information proposed to be submitted to the Commission is over- or under-inclusive.
  • Whether the FTC calibrated notice correctly. Should it account for events in which misuse is not likely (e.g., encrypted data)? Should it only require notice when a company must provide notice to a governmental entity under another state or federal requirement?
  • The timing of the notice period – i.e., whether a shorter period is practicable.
  • Whether the requirement should allow law enforcement agencies to prevent or delay notification and, relatedly, whether information reported to the Commission should be kept confidential in some circumstances.
  • The extent to which the Commission should require notification to consumers.

The deadline to submit written comments is 60 days after the notice is published in the Federal Register.

Why Do the Changes and the FTC’s Proposal Matter?

Although the Safeguards Rule is limited to financial institutions, the granularity of the controls announced is likely to influence financial institutions’ contract terms. On a practical level, the new Safeguards Rule may serve as a model for data security standards in other sectors; the proposed notification requirement may set the stage for additional burdens on financial institutions and other firms; and the Rule changes could affect the scope of state privacy laws, some of which exempt data covered by the Gramm-Leach-Bliley Act from their requirements. Further, while the Safeguards Rule does not include civil penalties, businesses should take note of the various efforts underway by the agency to create a basis to assert civil penalties under a variety of scenarios, as we’ve discussed, for example, here (notice to companies about earnings and endorsement claims), here (what notice about penalty offenses means), and here (GLBA pretexting).

* * *

If you have questions about these changes, or would like to discuss submitting comments on the proposed notification requirement, please contact us. As with all of these fast-moving developments, we are closely monitoring and will post updates as they occur.

]]>
Cracks in the Privacy Wall Between Kids and Teens: Is Teen Privacy Legislation on the Horizon? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/since-congress-enacted-coppa-the-regulatory-wall-between-kids-and-teens-has-been-a-remarkably-durable-one https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/since-congress-enacted-coppa-the-regulatory-wall-between-kids-and-teens-has-been-a-remarkably-durable-one Wed, 13 Oct 2021 09:20:44 -0400 Since Congress enacted the Children’s Online Privacy Protection Act (COPPA) in 1998, the regulatory wall between kids and teens has been a remarkably durable one. During all this time, COPPA, the primary U.S. law protecting kids’ privacy, has protected children under 13 but hasn’t provided any protections for teens. While California’s privacy law grants some rights to teens under 16, these protections are narrow (opt-in for data sharing) and only apply within that state. This means that teens are generally treated like adults for purposes of privacy in the U.S.

It’s not exactly clear why COPPA’s age 13 cut-off was chosen in the first place. First year of teen-hood? Bar Mitzvah age? The age when children become too independent and tech-savvy to let their parents control their media? (Ahem – that happened at age six in my house.) Whatever the reasons for the original choice, age 13 has stuck, even as concerns about teens’ privacy and use of social media have grown, and Senator Markey and others have repeatedly proposed extending privacy protections to teens.

However, we might finally be seeing some cracks in the kid-teen privacy wall – cracks that could lead to a federal law protecting teens in the not-too-distant future.

These cracks are due to a confluence of events. Notably, in September 2020, the U.K. passed a law (the Age Appropriate Design Code or AADC) that requires all online commercial services “likely to be accessed by” kids and teens (including apps, programs, websites, games, community environments, and connected toys or devices) to meet 15 standards to ensure that their content is age appropriate. The law, which became fully effective in September 2021, starts with the principle that any service be designed with the “best interest of the child” as a primary consideration. It then details more specific requirements, including that defaults be set at the most protective level (e.g., location tracking and profiling are set to “off”), that data not be shared with third parties without a “compelling reason,” and that “nudge” techniques not be used to encourage minors to provide data or reduce their protections.

In response to the law, U.S. companies operating in the U.K. (notably, some of the large tech platforms) recently announced new protections for teens – a significant development in the long-running kid-teen debate, but one that has received relatively little attention. For example, Facebook/Instagram now says that for kids under 16, it will default them into private accounts; make it harder for “suspicious” accountholders to find them; and limit the data advertisers can get about them. Meanwhile, Google/YouTube has pledged similar protections for kids under 18, including private accounts by default; allowing minors to remove their images; applying restrictive default settings; turning off location history permanently; and limiting the data collected for ad targeting.

Following these announcements, Senator Markey and two House members sent a letter to the FTC urging it to ensure that these companies keep their promises, using its authority to stop deceptive practices under the FTC Act.

And there’s more. Last week, in developments widely covered in the media, a former Facebook employee detailed what she viewed as manipulation of teens using algorithms that kept them on the platform and exposed them to harmful content. Also, with broad-based privacy legislation perennially stalled, there’s been talk that Congress might prefer to tackle privacy issues that are more manageable and bipartisan (like kids’ and teen privacy) – talk that has only grown louder since the developments regarding Facebook.

Adding to the momentum, Senator Markey recently introduced a bipartisan bill (with Republican Senator Cassidy) that would provide privacy protections specific to teens, and Representative Castor has introduced a similar bill in the House. Further, the FTC has expressed a strong interest in protecting kids’ privacy, and in undertaking enforcement and rulemakings to extend U.S. privacy protections beyond the status quo.

In short, the kid-teen privacy wall is under pressure, and we could soon see a federal law, FTC enforcement, and/or (a harder climb) an FTC rulemaking using the agency’s Magnuson-Moss authority. For companies that collect teen data in connection with marketing or providing commercial products or services, this means double-checking your data practices to ensure that they’re age-appropriate and don’t expose teens to harms that can be avoided. (While the U.K.’s AADC principles are very ambitious, and do not apply to U.S.-only companies, they’re a valuable reference point.) It also means being prepared to explain and defend your data practices with respect to teens if regulators come knocking.

We will continue to monitor developments on this issue and provide updates as they occur.

]]>
Hope Emerges at Senate Data Security Hearing – But Will Congress Grab the Brass Ring? https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/hope-emerges-at-senate-data-security-hearing-but-will-congress-grab-the-brass-ring https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/hope-emerges-at-senate-data-security-hearing-but-will-congress-grab-the-brass-ring Sun, 10 Oct 2021 10:25:23 -0400 On October 6, 2021, the Senate Commerce Committee conducted its second in a series of hearings dedicated to consumer privacy and data, this time addressing Data Security. Similar to last week’s privacy hearing, the witnesses and Senators appeared to agree that federal data security standards – whether as part of privacy legislation or on their own – are urgently needed. If there were to be consensus around legislative principles, the hearing provides clues about what a compromise might look like.

Prepared Statements. In their opening statements, the witnesses emphasized the need for minimum standards governing data security.

  • James E. Lee, Chief Operating Officer of the Identity Theft Resource Center, explained that without minimum requirements, companies lack sufficient incentives to strengthen their data security practices to protect consumer data. Lee also advocated for more aggressive federal enforcement rather than the patchwork of state actions, which, he said, produce disparate impacts for the same conduct.
  • Jessica Rich, former Director of the FTC’s Bureau of Consumer Protection and counsel at Kelley Drye, emphasized that current laws do not establish clear standards for data security and accountability. She advocated for a process-based approach to prevent the law from being outpaced by evolving technologies and to ensure that it accommodates the wide range of business models and data practices across the economy. Among her recommendations, Rich suggested that Congress provide the FTC with jurisdiction over nonprofits and common carriers and authority to seek penalties for first-time violations.
  • Edward W. Felten, former Deputy U.S. Chief Technology Officer, former Chief Technologist of the FTC’s Bureau of Consumer Protection, and current Professor of Computer Science and Public Affairs at Princeton University, focused on the need to strengthen the FTC’s technological capabilities, including increasing the budget to hire more technologists. Notably, Felten advocated for more prescriptive requirements in data security legislation such as requiring companies to store and transmit sensitive consumer data in encrypted form and prohibiting companies from knowingly shipping devices with serious security vulnerabilities.
  • Kate Tummarello, Executive Director at Engine, a non-profit organization representing startups, addressed the importance of data security for most startups. Tummarello advocated for FTC standards or guidance with flexible options. Cautioning against overburdening startups, Tummarello explained that newer companies take data security seriously because they do not have the name recognition or relationships with consumers that larger companies may have, and a single breach could be extremely disruptive. Additionally, Tummarello highlighted that the patchwork of state laws provides inconsistent and unclear data security guidance and imposes high compliance costs.

Discussing a Federal Data Security Bill

  • Preemption. Witnesses agreed that a preemptive federal law does not necessarily mean a weaker law. Rich offered a middle ground, supporting preemption, but stating the law should fully empower the state AGs to enforce it.
  • Private Right of Action. Tummarello expressed concern that lawsuits across the country would contribute to the “patchwork” of laws that increase compliance costs. However, if a private right of action were necessary, she would support only a narrow private right of action with sufficient notice and guardrails, particularly to protect startups vulnerable to bad faith litigation. Lee demurred on whether a private right of action was needed but emphasized that consumers need to be protected no matter what state they live in. Rich stated that if the legislation is strong enough – with robust protections and remedies, full enforcement authority for the states, and significant resources for the FTC – it will protect consumers without the need for a private right of action. However, Rich also described “middle grounds” that could bridge the divide.
  • Sensitive Data. Although there were some questions about what constitutes sensitive data, the witnesses agreed that both biometric data and data about children should have heightened protections. Felten addressed concerns regarding artificial intelligence and facial recognition. Lee discussed the importance of protecting biometric data because it is permanent and cannot be changed – unlike a credit card number – if it is compromised.
  • Process-Based Approach. Rich emphasized the need for a “scalable” federal law that takes a process-based approach so that it does not quickly become obsolete. She added that the FTC could issue more detailed guidance on a regular basis to highlight particular technologies and safeguards that companies should consider. In contrast, Felten supported specific safeguards that the FTC would require through rulemaking, and Tummarello supported an FTC rule or guidance that would give companies a “menu” of safeguards to consider.
  • Inclusion with Data Privacy Bill. All witnesses supported including data security provisions into a federal privacy bill, but Rich stated that a data security law could prevent considerable consumer harm as a stand-alone measure.

FTC’s Role and Enforcement.

  • FTC as Enforcer. Similar to last week’s hearing, all witnesses agreed that the FTC was the agency best equipped to oversee and enforce a federal data security law.
  • Resources Needed. Felten noted that the FTC only has about ten technologists on staff, but could use 50-60 people in technologist roles to supplement its enforcement efforts. Rich added that technologists need a career path at the FTC, and that the FTC should reexamine the complicated ethics rules governing what technologists may do after they leave the FTC’s employment.
  • First-Time Penalties. All witnesses agreed that the FTC should be able to seek penalties for first-time violations. Tummarello, however, said that she supports first-time penalties only if there are clear rules of the road.

Overall, the hearing made clear that there are more areas of agreement than disagreement. The key questions are: (1) Can Congress resolve differences related to a private right of action, whether by ensuring strong protections without it or by compromising on a narrow private right of action? (2) Will Congress be willing to pass federal data security legislation on its own? We will continue to monitor developments on this issue and provide updates as they occur.

]]>
“Not Outgunned, Just Outmanned” (For Now): Senate Hearing on Privacy Law Addresses Under-resourced FTC https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/not-outgunned-just-outmanned-for-now-senate-hearing-on-privacy-law-addresses-under-resourced-ftc https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/not-outgunned-just-outmanned-for-now-senate-hearing-on-privacy-law-addresses-under-resourced-ftc Fri, 01 Oct 2021 08:45:07 -0400 On September 29, 2021, the Senate Commerce Subcommittee held a hearing titled Protecting Consumer Privacy. The senators addressed the potential $1 billion earmarked to strengthen the FTC’s privacy work, the future of a federal privacy and data protection law, and a myriad of other privacy-related topics, such as children’s privacy.

Prepared Statements. In their opening testimonies, the witnesses emphasized different types of needs for the FTC.

  • David Vladeck, a former Director of the FTC Bureau of Consumer Protection, strongly advocated for a federal privacy law and additional funding for the FTC to support increased efforts on technology-centered consumer protection enforcement. In his remarks, Vladeck noted that the FTC has been wholly understaffed and underfunded for forty years, despite the agency’s ever increasing responsibilities and the complexity of issues it now faces. Additionally, Vladeck emphasized the need to increase the FTC’s enforcement powers by giving the FTC rulemaking authority under the APA and civil penalty authority.
  • Morgan Reed, the president of The App Association, focused more on the need for a federal privacy law to reduce the compliance costs for small businesses. He reiterated that the patchwork of state laws increases risk and costs for small businesses.
  • Maureen Ohlhausen, a former Acting FTC Chairman and Commissioner, shifted the conversation from funding for the FTC to the importance of a federal privacy law. She noted that “the FTC lacks explicit authority to enforce statutory privacy requirements or promulgate privacy regulations,” and that a federal privacy law should address this gap, providing for enforcement by the FTC along with state attorneys general.
  • Ashkan Soltani, a former FTC Chief Technologist, primarily concentrated on the urgent need for expertise at the FTC. He emphasized the importance of hiring technologists and experts, but also paying them competitive rates to retain talent. The FTC is understaffed to handle litigation matters or to monitor compliance with consent orders, particularly those that require technical fluency.

Discussing the Federal Privacy Bill. The senators appeared to be in consensus that there is a need for a federal privacy law. Senator Wicker called on the Biden Administration to provide a liaison to Congress to prioritize the enactment of a law.

  • Right to Cure. Reed was adamant that a right to cure provision be written into the bill to protect small businesses from being punitively fined for unintentional mistakes such as not responding to an email within 30 days.
  • Private Right of Action. The witnesses went back and forth on the correct approach to a private right of action. While Soltani supported a private right of action as a means to “make up for the concern that there’s not enough enforcement capacity,” Ohlhausen was concerned that a private right of action would result not in consumer redress, but rather in attorneys’ fees. Reed stated that he preferred injunctive relief as a type of private right of action. Similarly, Soltani noted that in his experience, core behavior changes come not from fines but from injunctions and restrictions imposed on the business.
  • Preemption. Vladeck, Reed, and Ohlhausen supported federal preemption. Soltani, for his part, argued that a federal privacy law should be only a floor, not a ceiling. In other words, a federal privacy law should preempt less rigorous laws to set a baseline standard, but states could enact additional measures and add further protections for their constituents.
  • Carve-out. The witnesses went back and forth on whether the size of a business should factor into whether an entity is covered by the bill. Vladeck emphasized that small businesses can create big harms; therefore, the legislation needs to be focused on consumer harm rather than the size of the company. Reed agreed, but reiterated the need for a right to cure for small businesses.

Funding for the FTC. Senators focused on whether the FTC needs $1 billion to achieve its goal of protecting consumers. Vladeck wholeheartedly agreed and said that an additional $100 million a year would be a good start for the FTC. For example, in the recent Google litigation, Vladeck estimated that Google had 1,000 privacy attorneys, whereas the FTC had fewer than 100. Vladeck noted that the funding would be earmarked for hiring more attorneys, engineers, and technologists, as well as setting up a new bureau of privacy.

Children’s Privacy. The witnesses received several questions about protecting children’s privacy in the aftermath of reports on how social media affects children’s mental health. Vladeck specifically advocated for lowering the scienter standard that the FTC has to prove to show that a developer knew their technology was tracking children. This mirrors the EU’s “constructive knowledge” standard used for children’s privacy. Additionally, Vladeck suggested eliminating COPPA’s safe harbor program and rethinking the age limit. All witnesses agreed that children are vulnerable to targeted ads. In response to Senator Markey’s concern for children’s privacy, all witnesses responded that they would approve of another children’s privacy bill if Congress could not enact a sweeping data protection and privacy law for adults.

]]>
Health and Fitness Apps and IoT Technologies Should Take Note: FTC Expands Interpretation of its Health Breach Notification Rule https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/health-and-fitness-apps-and-iot-technologies-should-take-note-ftc-expands-interpretation-of-its-health-breach-notification-rule https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/health-and-fitness-apps-and-iot-technologies-should-take-note-ftc-expands-interpretation-of-its-health-breach-notification-rule Fri, 17 Sep 2021 14:42:29 -0400 In an aggressive expansion of its security and privacy enforcement programs, on September 15, 2021, the FTC issued what it characterized as a “Policy Statement” reinterpreting an old rule about personal health records.

First, some background. In 2009, Congress directed the FTC to create a rule requiring companies to provide notice when there is an unauthorized acquisition of certain health information not covered by HIPAA. At the time, the FTC explained that its Health Breach Notification Rule was narrow, consistent with the text of the law, applying only to security breaches by vendors of certain health data repositories (called “personal health records” or “PHRs”) and certain companies that work with PHR vendors.

Flash forward to September 2021. The FTC’s Policy Statement declares a broad range of health, fitness, wellness, and related technologies to be covered by the Rule if they can draw information from “consumer inputs” and APIs that include “personal health records.” This scope is markedly broader than the agency’s previously-issued guidance, which reiterated the narrow application of the Rule. To further illustrate, the FTC now says that health apps, such as glucose monitors or fitness trackers, are subject to the Rule if they draw information from a device or wearable and a phone calendar. In an unprecedented, expansive application of a narrow breach notice rule to consumer privacy, presumably to address what Chair Khan characterizes as “surveillance-based advertising,” the Statement also asserts that the “sharing of covered information without an individual’s authorization” triggers breach notification obligations. The FTC issued this policy statement even as the Commission was in the midst of seeking public comment on the rule as part of its periodic rule review process.

Companies violating the Rule face civil penalties of $43,792 per violation.

Commissioners Wilson and Phillips issued strong dissents, calling the Commission majority to task for abandoning prior business guidance and ignoring the Administrative Procedure Act’s notice and comment requirements. FTC Chair Khan, in turn, lamented the fact that the Commission had not brought an enforcement action under the Rule, cautioning that “the Commission should not hesitate to seek significant penalties against developers of health apps and other technologies that ignore [the Rule’s] requirements.”

App developers and other companies providing health, wellness, fitness, and related apps should consider the implications of the FTC’s Statement, and assess the potential applicability to their business, even if they do not normally view themselves as covered by HIPAA or operating in an adjacent space. Indeed, the FTC’s Policy Statement underscored that its guidance was intended to sweep broadly, noting its relevance for apps and other technologies that “track diseases, diagnoses, treatment, medications, fitness, fertility, sleep, mental health, diet, and other vital areas.” Unfortunately, the Policy Statement raises more questions than it answers. For example:

  • Is all personal information collected by such technologies subject to the FTC’s new interpretation of the Health Breach Notification Rule?
  • Do current data governance policies and practices provide appropriate safeguards?
  • Are existing consumer disclosures and consents adequate to mitigate risk? For example, what level of “authorization” would be required for sharing personal information for interest-based advertising and analytics purposes?

* * *

We will closely monitor developments and post updates as they occur.

]]>
Jessica Rich and Laura Riposo VanDruff, Two Former Senior FTC Officials, Join Kelley Drye’s Privacy and Advertising Practices https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/jessica-l-rich-and-laura-riposo-vandruff-two-former-senior-ftc-officials-further-bolstering-kelley-dryes-privacy-and-advertising-practices https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/jessica-l-rich-and-laura-riposo-vandruff-two-former-senior-ftc-officials-further-bolstering-kelley-dryes-privacy-and-advertising-practices Wed, 08 Sep 2021 13:33:07 -0400 We are thrilled that Jessica Rich and Laura Riposo VanDruff have joined the firm’s Privacy and Advertising practice groups. Both attorneys are former top officials at the Federal Trade Commission (FTC), with Rich having served as Director of the Bureau of Consumer Protection (BCP) and VanDruff as an Assistant Director in BCP’s Division of Privacy and Identity Protection (DPIP).

Jessica and Laura join our impressive list of former FTC officials, including the firm’s managing partner, Dana Rosenfeld, who served as Assistant Director of BCP and attorney advisor to FTC Chairman Robert Pitofsky; former Bureau Directors Bill MacLeod and Jodie Bernstein; and Aaron Burstein, who served as senior legal advisor to FTC Commissioner Julie Brill.

Jessica served at the FTC for 26 years and led major initiatives on privacy, data security, and financial consumer protection. She is credited with expanding the FTC’s expertise in technology and was the driver behind FTC policy reports relating to mobile apps, data brokers and Big Data, the Internet of Things, and federal privacy legislation. She also directed the agency’s development of significant privacy rules, including the Children’s Online Privacy Protection Rule and the Gramm-Leach-Bliley Safeguards Rule. She is a recipient of the FTC Chairman’s Award, the agency’s highest award for meritorious service, and the first-ever recipient of the Future of Privacy Forum’s Leadership Award. Jessica is also a fellow at Georgetown University’s Institute for Technology Law & Policy. Prior to joining Georgetown, she was an Independent Consultant with Privacy for America, a business coalition focused on developing a framework for federal privacy legislation.

Laura also brings significant experience to Kelley Drye. As Assistant Director for the FTC’s Division of Privacy & Identity Protection, Laura led the investigation and prosecution of matters relating to consumer privacy, credit reporting, identity theft, and information security. Her work included investigation initiation, pre-trial resolution, trial preparation, and trial practice relating to unreasonable software security, mobile operating system security update practices, and many other information privacy and identity protection issues. She joins the firm from AT&T where she served as an Assistant Vice President – Senior Legal Counsel advising business clients on consumer protection risks, developing and executing strategies in response to regulatory inquiries, and participating in policy initiatives within the company and across industry.

Jessica and Laura are an impressive duo and are sure to be an asset to our clients as they prepare for the future of privacy and evolving consumer protection law.

* * *

Subscribe here to Kelley Drye’s Ad Law News and Views newsletter to see another side of Jessica, Laura and others in our second annual Back to School issue. Subscribe to our Ad Law Access blog here.

]]>
CPRA Update: How to Prepare for Privacy Compliance as an Employer https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/cpra-update-how-to-prepare-for-privacy-compliance-as-an-employer https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/cpra-update-how-to-prepare-for-privacy-compliance-as-an-employer Sun, 20 Jun 2021 08:10:53 -0400 Last year’s voter guide to California Proposition 24, the California Privacy Rights Act (CPRA), included a stark argument against enacting the privacy ballot initiative because it did not go far enough to protect employee privacy. “Currently, employers can obtain all kinds of personal information about their workers and even job applicants,” the argument against Proposition 24 written by Californians for Privacy Now stated. “Proposition 24 allows employers to continue secretly gathering this information for more years to come…”

The message did not stick. Voters overwhelmingly enacted the CPRA, apparently judging that its provisions – including those that apply to employers – were worth an additional two-year waiting period. The effective date of the new law is January 1, 2023.

As companies build their roadmap to CPRA compliance, that assessment should also take into account planning for employee and job applicant privacy changes. The new law imposes first-in-the-nation obligations that grant employees and job applicants new rights to access, correct, delete, and opt out of the sale or sharing of their personal information. The law also prohibits discriminating against employees or job applicants who lodge privacy rights requests.

In this post, we provide an overview of topics that employers should know as the sunset of the employer exception to CCPA approaches.

Why Would CCPA Apply to Employers?

The California Consumer Privacy Act of 2018 (CCPA), which became effective on January 1, 2020, originally applied to employers. The law defines a “consumer” as a natural person who is a California resident. This includes employees, job applicants, contractors, or other staff of a business.

In 2019, the California legislature amended the CCPA with a stopgap measure – for one year, the CCPA would not apply to employers. The measure, AB 25, exempted from the CCPA personal information collected by a business about a natural person in the course of that person acting as an employee, job applicant, or contractor. Also exempt is emergency contact information and information necessary to administer benefits.

Last year, California voters extended the employer exemption for another two years to January 1, 2023 in the CPRA ballot initiative.

What Employers are Covered by California Privacy Law?

If a business is covered by the CCPA for consumer data, it is covered for employee data. Starting in January 2023, the CPRA thresholds for coverage are as follows:

  • Annual gross revenues in excess of $25 million in the preceding calendar year,
  • Buys, sells, or shares the personal information of 100,000 or more California consumers or households, or
  • Derives 50 percent or more of its annual revenues from selling or sharing California consumers’ personal information.
Some employers may be eligible for certain exemptions that are applicable to already-regulated information that they hold about their employees. For example, credit information that employers routinely collect to assess employment eligibility may be subject to an exception, because the information is already covered under federal fair credit reporting laws.

Also, employers that have existing obligations as business associates under the Health Insurance Portability and Accountability Act (HIPAA) may also be exempt with respect to any medical, protected health information (PHI), or covered benefits information that they maintain, use, or disclose.

In general, employers are also not required to comply with CPRA obligations that conflict with other federal, state, or local laws or legal obligations, or restrict an employer’s ability to exercise or defend legal claims. For example, affirmative legal obligations to gather and maintain certain information, such as EEO-1 reports or compensation-related information may directly conflict with CPRA.

What Constitutes Employee Personal Information?

The definition of employee “personal information” includes information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular employee.

This may include name, contact information, identifiers, protected classifications (like gender, race, or sexual orientation), financial or medical information, account log in, religious or philosophical beliefs, union membership, commercial information, biometric information, internet or electronic network activity information, geolocation data, audio, electronic, visual, thermal, olfactory, or similar information, professional or employment-related information, education information, and inferences drawn from any of this information about the employee.

The contents of an employee’s mail, email, and text messages constitutes sensitive personal information, a sub-category of personal information, unless the employer is the intended recipient of the communication.

What Obligations Apply Starting in January 2023?

All CPRA obligations apply. These include:

  • Notice: Employers will be required to provide a comprehensive notice of their collection of personal information from employees, job applicants, and contractors, including a description of the categories of personal information collected, the purposes of collection, details on disclosure of personal information, and information about retention of personal information.
  • Right to access: Provide employees with a right to access categories of personal information and specific pieces of personal information. This includes any inferences drawn from personal information to create a profile reflecting the employee’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
  • Right to correct: Provide employees with the right to correct their personal information using commercially reasonable efforts.
  • Right to delete: Provide employees the right to delete their personal information. However, numerous statutory exemptions may apply, including allowing an employer to retain personal information reasonably anticipated by the employee within the context of an ongoing relationship with the employer, to perform a contract between the employee and employer, or to comply with a legal obligation.
  • Right to restrict uses of sensitive personal information: Sensitive personal information includes a social security number, account log in, financial information, geolocation, racial or ethnic origin, religious beliefs, sexual orientation, health information, biometrics, and the contents of employee communications unless the employer is the intended recipient of the communication. Starting in January 2023, an employee may be able to direct an employer to limit certain uses of sensitive personal information for specific business purposes, as well as to direct an employer to limit disclosure of sensitive personal information, absent a qualifying exemption.
  • Right to opt out: Provide employees the right to opt out of the sale of personal information to third parties. The term “sale” is a broad term, and includes disclosing employee information to business partners, vendors, and contractors absent a written agreement containing specific terms restricting the third party’s use of that data, or a qualifying exemption.
Certain obligations are subject to change depending on action expected in the coming year from the newly constituted California Privacy Protection Agency.

What Steps Should Employers Take to Prepare?

Given the complexity of HR data and systems, as well as the sensitivity of employee data generally, it is not too early for employers to prepare for CPRA. Such efforts might include, for example:

  • Privacy Stakeholders: Determine the legal, HR, and technology support (internal resources or external technology solutions) responsible for the efforts necessary to build a privacy compliance program and respond to privacy rights requests.
  • Data Mapping: Understand the information that the business collects, the categorization of data (whether personal information or sensitive personal information), the location of the data, and the steps to access, correct, or delete the data. A major part of this effort should also include determining which data practices identified are subject to applicable exemptions from CPRA.
  • Contract Review: Review partner contracts to confirm that service providers and contractors are correctly distinguished from third parties, and that the contracts include the necessary restrictions for each classification. This effort might prioritize those partners that present more risk to the company, whether due to the nature of the processing or the type or volume of data in scope. Updating these contracts, however, might wait until there is more insight from the forthcoming CPRA regulations by the California Privacy Protection Agency (CalPPA) as to necessary terms, although the existing CCPA regulations are instructive.
  • Response Procedures: Develop procedures for responding to employee requests, including managing sensitive requests while maintaining personal information as confidential and accessible to internal personnel only on a need-to-know basis.
  • Retention Policy: Develop and document a retention policy that complies with applicable employer data retention obligations.
  • Notice: Draft an employee privacy policy that complies with new statutory obligations under CPRA, as well as forthcoming regulations by the CalPPA.
Do Any of These Obligations Apply Now?

Employers may have an obligation to provide a notice at or before collection of personal information that details the categories of personal information that they collect and the purposes for which personal information will be used.

However, due to an apparent drafting error in the CPRA ballot initiative, this privacy notice obligation is muddled by a textbook case of unclear statutory construction.

Here’s what happened. Originally, AB 25 required employers to provide a privacy notice to employees. However, the CPRA ballot initiative from last year changed a critical code section reference in an apparent drafting error. In so doing, the CPRA ballot initiative left unclear whether the employer privacy notice is required.

AB 25 said that employers would be required to provide a privacy notice based on Cal. Civ. Code 1798.100(b). The CPRA ballot initiative changed the reference to Cal. Civ. Code 1798.100(a). It is possible that the drafters intended to point to subsection (a) because in the CPRA ballot initiative this code section also requires a privacy notice. But the CPRA ballot initiative version of the code section is not actually the law until January 1, 2023.

That’s a problem because under current law (effective until December 31, 2022), Cal. Civ. Code 1798.100(a) talks about a different topic entirely – giving consumers the right to request that a business disclose the categories and specific pieces of personal information the business has collected about a consumer.

What is a reasonable interpretation in light of this problem? When it comes to statutory interpretation of ballot initiatives, courts generally say that the drafter’s intent does not matter. In California, usually a court first looks at the language of the statute. If the language is not ambiguous, the court presumes the voters intended the meaning apparent from the language. If the language is ambiguous, then courts usually look at the ballot initiative voter materials for clues on how voters made their decision.

It is easy to see why a court might agree that the language is ambiguous. The employer exception clearly does not give employees a right to access their personal information until January 1, 2023. Giving full effect to 1798.100(a) is hampered by the fact that the CCPA’s core instructions on how to provide access to personal information, and what to provide, are themselves subject to the employer exemption.

This brings us back to the ballot initiative materials provided to voters. The arguments against proposition 24 from Californians for Privacy Now warn that employers will be able to secretly gather personal information “for more years to come.” Clearly, there is no recognition in the ballot initiative materials of any interim employee rights.

Bottom line? The law right now is unclear, and so, as a practical matter, it’s a best practice (and required in a few other states) to publish a privacy notice for employees and job applicants.

Final Question: Do Employers Have Privacy Obligations in Other States?

No other state has enacted a CPRA-style comprehensive privacy law that applies to employees; for example, Virginia and Colorado explicitly exempted the employment context without a sunset. But some states, such as Connecticut, do require some form of privacy notice to employees. There are also two-party consent requirements in a number of states that apply to recording calls, as well as laws that require disclosure about electronic monitoring.

Conclusion

The best way to navigate these developments is to plan ahead with a compliance roadmap leading to 2023. Figure out what resources you’ll need, including what types of internal and external support will be critical for success. Given the complexities involved, thoughtful (and realistic) preparation is a must.

* * *


Subscribe here to Kelley Drye’s Ad Law Access blog and here for our Ad Law News and Views newsletter. Visit the Advertising and Privacy Law Resource Center for updated information on key legal topics relevant to advertising and marketing, privacy, data security, and consumer product safety and labeling.

Kelley Drye attorneys and industry experts provide timely insights on legal and regulatory issues that impact your business. Our thought leaders keep you updated through advisories and articles, blogs, newsletters, podcasts and resource centers. Sign up here to receive our email communications tailored to your interests.

Follow us on LinkedIn and Twitter for the latest updates.

]]>
Further Amendments to CCPA Regulations Are Approved and in Effect https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/californias-office-of-administrative-law-approved-further-revisions-to-the-attorney-generals-ccpa-regulations-on-march-15-2021 https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/californias-office-of-administrative-law-approved-further-revisions-to-the-attorney-generals-ccpa-regulations-on-march-15-2021 Wed, 17 Mar 2021 10:22:46 -0400 California’s Office of Administrative Law approved further revisions to the Attorney General’s CCPA regulations on March 15, 2021. The revisions went into effect upon approval. In substance, the revisions are identical to the fourth set of modifications the Attorney General proposed on December 10, 2020, and make the following changes:

(1) Notice for Sale of PI Collected Offline: Businesses that sell personal information collected offline must provide an offline notice by means such as providing paper copies or posting signs in a store, or giving an oral notice if collecting personal information over the phone.

(2) Opt-Out Icon: The revised regulations provide that businesses may use an opt-out icon in addition to, but not in lieu of, notice of a right to opt out or a “Do Not Sell My Personal Information” link.

(3) Do Not Sell Requests: A “Do Not Sell” request must “be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out.” The change prohibits businesses from using any method that is designed to or would have the effect of preventing a consumer from opting out. The revised regulation offers examples of prohibited opt-out practices, which include requiring a consumer to: (A) complete more steps to opt out than to re-opt in after a consumer had previously opted out; (B) provide personal information that is not necessary to implement the opt-out request; and (C) read through a list of reasons why he or she shouldn’t opt out before confirming the request.

(4) Consumer Requests from Authorized Agents: A business may now require an authorized agent who submits a request to know or delete to provide proof that the consumer gave the agent signed permission to submit a request. The regulations also preserve the options business previously had of requiring the consumer to verify their identity directly to the business or directly confirming that they provided the authorized agent permission to submit the request.

(5) Children’s Information: The addition of the word “or” in section 999.332 requires businesses that sell the personal information of children under the age of 13 and/or between the ages of 13 and 15 to describe in their privacy policies how to submit opt-in-to-sale requests.

We will continue to monitor closely further developments in CCPA regulations.


]]>