Ad Law Access
Updates on advertising law and privacy law trends, issues, and developments
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access

Upcoming Price Gouging and Employee/HR Data Privacy Webinars
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/upcoming-price-gouging-and-employee-hr-data-privacy-webinars
Mon, 18 Jul 2022 14:53:02 -0400

How To Protect Employee/HR Data and Comply with Data Privacy Laws (Wednesday, July 20)

As workforces become increasingly mobile and remote work is more the norm, employers face the challenge of balancing the protection of their employees’ personal data and privacy against the need to collect and process personal data to recruit, support and monitor their workforces. Mounting regulations attempt to curb employers’ ability to gather and utilize employee data—from its historical use in processing employee benefits and leave requests to employers’ collection, use or retention of employees’ biometric data to ensure the security of the organization’s financial or other sensitive information systems. Learn what employers can do now to protect employee data and prepare for the growing wave of data privacy laws impacting the collection and use of employee personal data.

RSVP

Avoiding Price Gouging Claims (Wednesday, August 3)

Recently, state attorneys general, the House Judiciary Committee, and many others have weighed in on rising prices in an attempt to weed out price gouging and other forms of what they deem “corporate profiteering.” State and federal regulators are looking carefully at pricing as consumers and constituents become more sensitive to the latest changes, and price gouging enforcement is an avenue states may use to appease the public. Unlike past emergencies, the current supply chain and labor shortages, along with skyrocketing costs for businesses, make it unrealistic for companies to simply freeze all price increases. This webinar will cover:

  • The basics of price gouging laws and related state emergency declarations and how to comply
  • The differences and varied complexities in state laws
  • General best practice tips
  • How AGs prioritize enforcement

Register

* * * *

Find more upcoming sessions, links to replays and more here

Day in the Life of a Chief Privacy Officer
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/day-in-the-life-of-a-chief-privacy-officer
Thu, 17 Feb 2022 00:30:48 -0500

On this special episode, Privacy and Information Security practice chair Alysa Hutnik chats with Shana Gillers, TransUnion’s Chief Privacy Officer. Alysa and Shana discuss the journey to becoming a chief privacy officer, hot topics, and what it takes to stay on top of your game in privacy today.

Watch a video version here or the audio version here.

Shana Gillers

Shoshana Gillers has served as TransUnion’s Chief Privacy Officer since September 2019. In this role Ms. Gillers oversees compliance with privacy laws across TransUnion’s global footprint and promotes a culture of responsible data stewardship.

Prior to joining TransUnion, Ms. Gillers spent four years at JPMorgan Chase, ultimately serving as Vice President and Assistant General Counsel, Responsible Banking, Data and Privacy. Previously, she served as a federal prosecutor for eight years at the U.S. Attorney’s Office in Chicago, and as a litigator for four years at WilmerHale in New York. Ms. Gillers clerked for the Hon. Robert D. Sack on the U.S. Court of Appeals for the Second Circuit and for the Hon. Aharon Barak on the Supreme Court of Israel.

Ms. Gillers received a B.A. from Columbia University, summa cum laude, and a J.D. from Yale Law School.

Alysa Z. Hutnik

Alysa chairs Kelley Drye’s Privacy and Information Security practice and delivers comprehensive expertise in all areas of privacy, data security and advertising law. Her experience ranges from strategic consumer protection oriented due diligence and compliance counseling to defending clients in FTC and state attorneys general investigations and competitor disputes.

Prior to joining the firm, Alysa was a federal clerk for the Honorable Joseph R. Goodwin, United States District Judge, Southern District of West Virginia.

Alysa received a B.A. from Haverford College, and a J.D. from the University of Maryland Carey School of Law.

Privacy Law Update: Colorado Privacy Bill Becomes Law: How Does it Stack Up Against California and Virginia?
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/privacy-law-update-colorado-privacy-bill-becomes-law-how-does-it-stack-up-against-california-and-virginia
Thu, 08 Jul 2021 23:50:13 -0400

The Colorado Legislature recently passed the Colorado Privacy Act (“ColoPA”), joining Virginia and California as states with comprehensive privacy legislation. Colorado Governor Jared Polis signed the bill (SB 21-190) into law on July 7, and ColoPA will go into effect on July 1, 2023.

How does the measure stack up against the VCDPA and the CCPA (as amended by the CPRA)? The good news is that, in broad terms, ColoPA does not impose significant new requirements beyond those already addressed under the CCPA or VCDPA, but there are a few distinctions to note.

  • Establishing consumer rights. As with the VCDPA and the CCPA, ColoPA provides rights for access, deletion, correction, portability, and opt out for targeted advertising, sales, and certain profiling decisions that have legal or similar effects. Unlike CCPA, which permits an authorized agent to submit any consumer requests, under ColoPA, authorized agents can only submit sale opt-out requests.
  • Universal opt-out requests. ColoPA also requires the Attorney General to establish technical specifications for a universal targeted advertising and sale opt-out (e.g., global privacy control) by July 1, 2023, which controllers must honor starting July 1, 2024. Note there also will be CPRA regulations on this point, with compliance likely due by January 1, 2023. Unlike the CPRA, which makes the global privacy control optional, ColoPA requires controllers to comply with the universal opt-out. A minimal sketch of detecting such a signal appears after the comparison table below.
  • Appealing consumer rights decisions. Like Virginia, ColoPA requires controllers to set up mechanisms permitting consumers to appeal a controller’s decision not to comply with a consumer’s request. The controller must then inform the consumer of its reasons for rejecting the request and also inform the consumer of his or her ability to contact the Attorney General “if the consumer has concerns about the result of the appeal.”
  • Requiring data protection assessments. Similar to GDPR, and consistent with the VCDPA, ColoPA requires data protection assessments (“DPAs”) for certain processing activities, namely, targeted advertising, sales, certain profiling, and processing of sensitive personal data. As with Virginia, the Colorado Attorney General has the right to request copies of a controller’s DPAs.
  • Consent for certain processing. Again following Virginia’s lead, ColoPA requires opt-in consent for the processing of sensitive personal information, which covers categories such as racial or ethnic origin, religious beliefs, citizenship, or genetic or biometric data used to uniquely identify an individual. ColoPA also requires consent for processing children’s data, with a “child” being any individual under the age of 13. Unlike the VCDPA, ColoPA does not require COPPA-compliant consent for such processing, but ColoPA does exempt from the law personal data that is processed consistent with COPPA requirements.
  • Right to cure. ColoPA allows controllers to cure violations and is unique both in establishing the longest cure period, at 60 days, and in repealing that provision on January 1, 2025. By that date, the Attorney General may have established rules for issuing opinion letters and guidance that businesses can rely on in good faith to defend an action that would otherwise violate the law. Such rules must go into effect by July 1, 2025.
  • Establishing controller duties. ColoPA establishes certain duties for controllers, including the duties of transparency, purpose specification, data minimization, care, avoiding unlawful discrimination, and duties regarding sensitive data. These duties create related obligations, such as providing a privacy policy, establishing security practices to secure personal data, and obtaining consent prior to processing sensitive data or children’s data.
  • Consent for secondary use. ColoPA also establishes a “duty to avoid secondary use.” This duty requires consent to process personal data for purposes “not reasonably necessary or compatible with” the original purposes for collection. It suggests that businesses need to keep detailed records of the personal data they collect and the purposes for which it was initially collected, confirm those purposes are consistent with disclosures made to consumers, and track the scope of any consent tied to such data uses.
ColoPA vs. VCDPA vs. CCPA

Thresholds to Applicability
  • ColoPA: Conduct business in CO or produce products or services targeted to CO and (a) control or process personal data of at least 100,000 consumers; or (b) derive revenue or receive a discount on the price of goods or services from selling personal data and control or process personal data of at least 25,000 consumers
  • VCDPA: Conduct business in or produce products or services targeted to VA and (a) control or process personal data of at least 100,000 consumers; or (b) derive over 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers
  • CCPA: Conduct business in CA and collect personal information of CA residents and (a) have $25 million or more in annual revenue for the preceding calendar year (as of Jan. 1 of the current calendar year); (b) annually buy, sell, or share personal data of more than 100,000 consumers or households; or (c) earn more than 50% of annual revenue from selling or sharing consumer personal information

Consent
  • ColoPA: Opt-in consent required for processing sensitive personal data, including children’s data, and for certain secondary processing
  • VCDPA: Opt-in consent required for processing sensitive personal data, and COPPA-compliant consent for processing children’s data
  • CCPA: Opt-in consent required for sharing PI for cross-context behavioral advertising for consumers under 16, including parental consent for children under 13

Opt-Out
  • ColoPA: Required for targeted advertising, sales, and profiling with legal or similarly significant effects
  • VCDPA: Required for targeted advertising, sales, and profiling with legal or similarly significant effects
  • CCPA: Required for profiling, cross-contextual advertising, and sale; right to limit use and disclosure of sensitive personal information

Other Consumer Rights
  • ColoPA: Access, deletion, correction, portability
  • VCDPA: Access, deletion, correction, portability
  • CCPA: Access, deletion, correction, portability

Authorized Agents
  • ColoPA: Permitted for opt-out requests
  • VCDPA: N/A
  • CCPA: Permitted for all requests

Appeals
  • ColoPA: Must create a process for consumers to appeal a refusal to act on consumer rights
  • VCDPA: Must create a process for consumers to appeal a refusal to act on consumer rights
  • CCPA: N/A

Private Cause of Action
  • ColoPA: No
  • VCDPA: No
  • CCPA: Yes, related to security breaches

Cure Period
  • ColoPA: 60 days, until the provision expires on Jan. 1, 2025
  • VCDPA: 30 days
  • CCPA: No

Data Protection Assessments
  • ColoPA: Required for targeted advertising, sale, sensitive data, and certain profiling
  • VCDPA: Required for targeted advertising, sale, sensitive data, and certain profiling
  • CCPA: Annual cybersecurity audit and risk assessment requirements to be determined through regulations
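For teams tasked with honoring a universal opt-out signal like the one described above, the most widely deployed mechanism today is the Global Privacy Control (GPC), which participating browsers send as a "Sec-GPC: 1" request header. The following is a minimal, framework-neutral sketch of detecting that header and recording an opt-out preference. It is illustrative only: the record_opt_out hook and the plain-dictionary header shape are assumptions, and the technical specifications the Colorado Attorney General (or California regulators) ultimately adopt may differ.

```python
# Minimal sketch: detecting a Global Privacy Control (GPC) opt-out signal.
# Assumptions (not from the post): headers arrive as a plain dict of strings,
# and record_opt_out() is a hypothetical persistence hook you would supply.

def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True if the request carries a Sec-GPC: 1 header (case-insensitive key)."""
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("sec-gpc", "").strip() == "1"

def record_opt_out(user_id: str) -> None:
    # Hypothetical hook: flag this user as opted out of targeted advertising
    # and the sale of personal data in your consent-management store.
    print(f"opt-out recorded for {user_id}")

def handle_request(user_id: str, headers: dict) -> None:
    if gpc_opt_out_requested(headers):
        record_opt_out(user_id)

# Example:
handle_request("user-123", {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"})
```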

Given the significant overlap among the three privacy laws, companies subject to ColoPA should be able to leverage their VCDPA and CCPA implementation efforts for ColoPA compliance. If ColoPA is any example, other state privacy efforts may not veer too far from the paths the VCDPA and CCPA have forged. The key will be to closely monitor how the California Privacy Protection Agency and the Colorado Attorney General address forthcoming regulations and whether they introduce distinct new approaches in each state. Check back on our blog for more privacy law updates.

Smart (CA) TVs Are Listening: California Assembly Passes Voice Recognition Device Bill Headed to Senate
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/smart-ca-tvs-are-listening-california-assembly-passes-voice-recognition-device-bill-headed-to-senate
Sat, 22 May 2021 12:29:50 -0400

The California Assembly recently passed AB-1262, which updates an existing law to further limit the use of personal information collected through connected TVs and smart speaker devices. Specifically, the bill prohibits:

  • Operating a voice recognition feature of a connected TV or smart speaker without informing the consumer of the feature during setup or installation;
  • Using any recording or transcription collected through the feature that qualifies as personal information for any advertising purpose (unless deidentified);
  • Sharing such recordings or transcriptions, unless deidentified, with a third party without the consumer’s affirmative written consent; and
  • Retaining such non-deidentified recordings or transcriptions electronically without the consumer opting in to such retention during installation or otherwise in the device settings.
The bill adopts the same broad definition of “personal information” that is in the CCPA and separately defines “affirmative written consent,” detailing the specific language required for such consent. “Smart speaker devices” do not include cell phones, tablets, laptops with mobile data access, pagers, or motor vehicles.

California consumers would not have a private cause of action, but the state AG or district attorney would have the authority to seek penalties of up to $2,500 per connected TV or smart device sold or leased that violates the law. After passing the Assembly, the bill will now need to pass in the Senate and be signed by the Governor to become law. The bill was recently referred to the Senate’s Committee on Judiciary for review.

This bill signals the state’s continued focus on consumer privacy, as legislators continue to consider the privacy implications of smart devices and other technology, and the variety of ways in which companies use data for advertising purposes. We will continue to monitor the legislation and provide any updates.


Senate Republicans Release COVID-19 Privacy Bill
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/senate-covid-19-privacy-bill
Thu, 07 May 2020 20:54:19 -0400

In light of concerns associated with attempts to use personal data to track the spread of COVID-19, a group of Republican Senators, led by Mississippi Senator Roger Wicker, introduced the COVID-19 Consumer Data Protection Act of 2020 today.

The bill imposes specific requirements on entities seeking to process precise geolocation data, proximity data, persistent identifiers, and personal health information (together, “covered data”) in association with COVID-19 mitigation efforts. Among other things, the Act would require:

  • Notice/Consent: Prior notice and affirmative express consent for the collection, processing, or transfer of covered data to track COVID-19, monitor social distancing compliance, or for COVID-19 contact tracing purposes;
  • Opt Out Rights: Giving individuals the right to opt out of such processing;
  • Deletion Rights: Deleting or de-identifying all covered data once the entity is no longer using it;
  • Data Processing Restrictions: A public commitment to limit the processing of the data, unless certain exceptions apply;
  • Notice: Posting a clear and conspicuous privacy policy within 14 days of the Act’s enactment that provides information about data transfers, data retention practices, and data security practices; and
  • Accountability: During the public health emergency, providing a bi-monthly public report identifying the aggregate number of individuals from whom the covered entity has collected, processed, or transferred covered data for COVID-19 purposes with additional detail about how and why that information was used.
The bill also requires covered entities to engage in data accuracy (including allowing the individual to report inaccuracies), data minimization, and data security practices. The FTC has enforcement authority under the bill and would also be required to release data minimization guidelines in relation to COVID-19 processing.

Separately, the bill explicitly exempts covered entities from requirements under the Communications Act or regulations in relation to this processing. The bill also preempts any similar state law, although state attorneys general have enforcement authority along with the FTC.

Whether Congress will pass the measure is unclear, as Democrats and public interest organizations have voiced concerns about the bill. Still, assuming Congress can agree, it’s worth monitoring to see whether the measure may be included in any upcoming COVID-19 relief bill.


Data Privacy Considerations for Coronavirus Data Tools
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/data-privacy-considerations-for-coronavirus-data-tools
Sat, 28 Mar 2020 09:01:02 -0400

Data is helping governments, researchers, and companies across the world track the spread of the novel coronavirus, monitor cases and outcomes of COVID-19, and devise ways to halt the virus’s spread. As part of these efforts, raw data, software tools, data visualizations, and other resources are providing the public and policymakers with insights into the growth of the pandemic.

Personal information — some of which may be highly sensitive — is key to many of these efforts. Although some regulators in the U.S. and abroad have made it clear that privacy laws and the exercise of enforcement discretion provide leeway to process personal information in connection with COVID-19, they have also made it clear that privacy laws continue to apply. Federal Trade Commission (FTC) Chairman Joe Simons advises that the FTC will take companies’ “good faith efforts” to provide needed goods and services into account in its enforcement decisions but will not tolerate “deceiving consumers, using tactics that violate well-established consumer protections, or taking unfair advantage of these uniquely challenging times.” And, with many eyes on the California Attorney General’s Office in light of recent requests to delay enforcement of the California Consumer Privacy Act (CCPA), an advisor to Attorney General Xavier Becerra was quoted as stating: “We’re all mindful of the new reality created by COVID-19 and the heightened value of protecting consumers' privacy online that comes with it. We encourage businesses to be particularly mindful of data security in this time of emergency.”

Devoting some thought to privacy issues at the front end of COVID-19 projects will help to provide appropriate protections for individuals and address complications that could arise further down the road. This post identifies some of the key privacy considerations for contributors to and users of COVID-19 resources.

1. Is Personal Information Involved?

Definitions of “personal information” and “personal data” under privacy laws such as the CCPA and the EU’s General Data Protection Regulation (GDPR) are broad. Under the CCPA, for example, any information that is “reasonably capable of being associated with, or could reasonably be linked” with an individual, device, or household is “personal information.” This definition specifically includes “geolocation data.” Although some data sources provide COVID-19-related information at coarse levels of granularity, e.g., at the county, state, or national level, the broad definition of “personal information” under the CCPA, GDPR, and other privacy laws makes it worth taking a close look at geographic and other types of information to determine whether the data at issue in fact qualifies as “personal information,” or whether it is sufficiently anonymized to meet those laws’ definitions of de-identified and/or aggregate data. The CCPA, HIPAA, and other privacy laws provide examples of the safeguards expected before data may reasonably be treated as anonymized, and employing such standards can help avoid unnecessary privacy mishaps despite well-intentioned efforts.
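One way to make the de-identification question concrete: a common safeguard before publishing location-linked counts is to aggregate records to a coarse geographic unit and suppress small cells. The sketch below is illustrative only; the record layout, the "county" field, and the threshold of 11 are assumptions for the example, and whether a particular output qualifies as de-identified or aggregate data under the CCPA, HIPAA, or the GDPR remains a fact-specific legal judgment.

```python
# Minimal sketch: county-level aggregation with small-cell suppression.
# The record layout and the threshold are illustrative assumptions, not a legal standard.
from collections import Counter

SUPPRESSION_THRESHOLD = 11  # publish a count only if at least this many records support it

def aggregate_counts(records):
    """Count records per county, replacing small counts with a suppression marker."""
    counts = Counter(r["county"] for r in records)
    return {
        county: (n if n >= SUPPRESSION_THRESHOLD else "<suppressed>")
        for county, n in counts.items()
    }

# Example: individual-level rows never leave this function; only coarse totals do.
sample = [{"county": "Alameda"}] * 25 + [{"county": "Inyo"}] * 3
print(aggregate_counts(sample))  # {'Alameda': 25, 'Inyo': '<suppressed>'}
```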

2. What Level(s) of Transparency Are Appropriate About the Data Practices?

Although some COVID-19 tools may be exempt from statutory requirements to publish a privacy policy (e.g., the provider of the tool is not a “business” under the CCPA), there are still reasons for providers to explain what data they collect and how they plan to use and disclose the data:

  • Disclosures help individuals to reach informed decisions about whether they want to provide their data, e.g., by downloading an app and allowing it to collect their location and other information. If business practices and consumer expectations are not reasonably aligned around the data practices, the failure to provide an appropriate privacy notice could be deemed an unfair or deceptive practice, inviting the scrutiny of the FTC or State Attorneys General.
  • Developing a privacy policy (or other disclosure) can help provide internal clarification on what types of personal information (or not) an app or service needs and collects. A granular understanding of such data practices can help providers to identify and mitigate privacy and data security risks associated with such data practices.
  • Developing a disclosure about a provider’s data collection and usage can help clarify the decision-making structure among multiple stakeholders so that the group is better equipped to handle data governance decisions over the lifecycle of a project.
3. How to Address Government Requests/Demands for Personal Information?

Although much remains to be seen in how federal, state, and local governments will use personal information (if at all) to develop and implement strategies to slow the spread of coronavirus, it is not unreasonable to expect that government agencies will seek information from providers of COVID-19-related tools. The extent to which a provider can voluntarily provide information to the government — as well as the procedures that the government must follow to compel the production of information (and maintain the confidentiality of it in personally identifiable form) — depends on several factors, including what kind of information is at issue and how it was collected. Becoming familiar with the rules that apply to voluntary and compelled disclosures, and safeguards to help prevent such data from being subject to broad freedom of information laws, before a request arrives can help save valuable time down the road. In many of these scenarios, for example, aggregate or pseudonymous data may be sufficient.

4. What Considerations Are There for Licensing COVID-19-Related Personal Information?

Finally, any licensing of personal information in connection with COVID-19 tools deserves careful consideration, particularly if the CCPA applies. The CCPA imposes notice and opt-out requirements on entities that “sell” personal information. “Sell” is defined to include disseminating, disclosing, or otherwise “making available” personal information to for-profit third parties in exchange for “monetary or other valuable consideration.” Several types of open source licenses require users to accept certain restrictions on their use and/or redistribution of licensed data or software. For example, the Creative Commons Attribution-NonCommercial 4.0 International license requires licensees to agree (among other conditions) not to use licensed content for commercial purposes. Obtaining this promise in exchange for personal information could constitute “valuable consideration” and give rise to a “sale” under the CCPA. In addition, while not a “sale,” sharing personal information with a government authority would qualify as a disclosure under CCPA and would need to be accurately disclosed in the privacy policy.

Neither the California Attorney General nor the courts have interpreted the CCPA in the context of open source licenses. Until more authoritative guidance becomes available, it makes sense to think through the potential obligations and other consequences of applying and accepting specific license terms to COVID-19-related personal information.

Bottom line: Personal information has a key role to play in shaping responses to the novel coronavirus. Privacy laws remain applicable to this information. Applying privacy considerations to COVID-19 related practices involving data collection, sharing, and analysis will help mitigate unnecessary harms to consumers, aside from those presented by the virus itself.

For other helpful information during this pandemic, visit our COVID-19 Resource Center.

Be Careful What You Say About the CCPA
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/be-careful-what-you-say-about-the-ccpa
Tue, 12 Nov 2019 11:26:25 -0500

As privacy and personal data issues continue to be a focus of both legal action and media coverage, privacy policy statements are getting dusted off and reviewed by more eyes. Imprecise or inaccurate policy statements can themselves expose a company to potential liability. While most of the recent California Consumer Privacy Act (“CCPA”) attention has focused on the significant operational requirements, data flow classifications, future attorney general enforcement, and the limited private right of action for data breaches, perhaps the largest near-term CCPA risk is how the law overlaps with other California consumer protection statutes, and with litigation focused on alleged inaccuracy or deception in the public statements companies make about their privacy practices.

CCPA’s Limited Private Right of Action

The Attorney General’s Office was granted wide discretion and enforcement powers to impose fines of up to $2,500 for each unintentional violation and up to $7,500 for each intentional violation. Cal. Civ. Code 1798.155. The CCPA, however, provides only a limited private right of action for individual consumers, related to data security breaches. Cal. Civ. Code 1798.150. Plaintiffs can recover actual damages or statutory damages of $100 to $750. A broader private right of action, which would have permitted individuals to sue for any and all CCPA violations, was considered (SB 561), but that amendment failed to pass in May.

Where There’s a Will, There’s a Way?

But anyone expecting that companies will face privacy-related consumer litigation only in the context of a data breach is underselling the risk. While direct actions under the CCPA may be limited, the requirements of the CCPA may serve as the basis for claims under other consumer protection statutes. And, importantly, the public statements and policies that companies issue will be scrutinized not just for actual compliance, but for whether companies are fulfilling their own promises. Indeed, nothing prevents individuals from filing putative consumer class actions alleging false statements, unfair business practices, or misleading conduct by companies in connection with their privacy policies and practices.

What Types of Claims Are Likely to be Filed?

These claims are likely to be brought pursuant to other California consumer protection statutes, such as California’s Unfair Competition Law (Bus. & Prof. Code 17200), False Advertising Law (Bus. & Prof. Code 17500), and Consumer Legal Remedies Act (Civ. Code 1750). For example:

  • Section 17200 prohibits “any unlawful, unfair or fraudulent business act or practice and unfair, deceptive, untrue or misleading advertising.” Put differently, a violation of any other California law, including the CCPA, can serve as the basis for a claim. That is true even where that underlying statute does not, itself, give rise to a private right of action.
  • Similarly, Section 17500 can give rise to a claim based on the dissemination of untrue or misleading statements concerning the performance of services. That would include statements made concerning the collection, use, handling, storage, dissemination, or destruction of personal information in connection with a business’s activities.
  • Finally, the CLRA prohibits a broad range of representations and statements concerning a company’s policies, procedures, and services. In addition to actual damages, the statute permits recovery of punitive damages and attorneys’ fees.
Courts have found that violations of internal policies and/or statements concerning those policies provide a sufficient foundation for such actions. See, e.g., In re Adobe Sys., Inc. Privacy Litig., 66 F. Supp. 3d 1197 (N.D. Cal. 2014) (plaintiffs’ allegations that they relied on Adobe’s claims that personal data would be protected were sufficient to establish UCL standing); Smith v. Chase Mortg. Credit Grp., 653 F. Supp. 2d 1035, 1045-46 (E.D. Cal. 2009) (concluding that defendant’s alleged violation of internal policy provides a basis for an unfairness claim).

Precision in Privacy Promises

These risks are a good reminder that it is critical not just to include the CCPA-required disclosures in privacy statements and in communications responding to consumer rights requests, but also to be vigilant and precise in describing privacy practices and how the company is honoring those requests. In the end, a company’s statements about its CCPA compliance could end up triggering exposure far greater than anything available under the CCPA itself.

CCPA Update: California Senate Committee Approves Privacy Law Amendments
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ccpa-update-california-senate-committee-approves-privacy-law-amendments
Tue, 13 Aug 2019 10:40:48 -0400

Amendments to the California Consumer Privacy Act (CCPA) continued to advance on Monday, as the California legislature returned from its summer recess. With just five weeks to go until the September 13th deadline for the legislature to pass bills, and fewer than five months until the CCPA is set to take effect, the Senate Appropriations Committee gave the greenlight to six bills: AB 25, AB 846, AB 1564, AB 1146, AB 874, and AB 1355. The bills were ordered to a “second reading,” meaning they head to the Senate floor for consideration without a further hearing in the Senate Appropriations Committee. Two of those bills, AB 874 and AB 1355, will be placed on the Senate’s consent calendar, because they have not been opposed.

The Senate Appropriations Committee also voted to advance AB 1202, the data broker amendment, but placed the bill in the Committee’s suspense file. This procedural action holds bills that will have a significant fiscal impact on the State of California’s budget for consideration all at once to ensure that fiscal impacts are considered as a whole.

Here’s the full list of amendments as of August 12, 2019:

Ordered to Second Reading in the California Senate

  • EMPLOYEE EXEMPTION: Assembly Bill 25 changes the CCPA so that the law does not cover collection of personal information from job applicants, employees, business owners, directors, officers, medical staff, or contractors.
  • LOYALTY PROGRAMS: Assembly Bill 846 provides certainty to businesses that certain prohibitions in the CCPA would not apply to loyalty or rewards programs.
  • CONSUMER REQUEST FOR DISCLOSURE METHODS: Assembly Bill 1564 requires businesses to provide two methods for consumers to submit requests for information, including, at a minimum, a toll-free telephone number. A business that operates exclusively online and has a direct relationship with a consumer from whom it collects personal information is only required to provide an email address for submitting CCPA requests.
  • VEHICLE WARRANTIES & RECALLS: Assembly Bill 1146 exempts vehicle information retained or shared for purposes of a warranty or recall-related vehicle repair.
  • PUBLICLY AVAILABLE INFORMATION: Assembly Bill 874 streamlines the definition of “publicly available” to mean information that is lawfully made available from federal, state, or local government records. The bill also seeks to amend the definition of “personal information” to exclude deidentified or aggregate consumer information.
  • CLARIFYING AMENDMENTS: Assembly Bill 1355 exempts deidentified or aggregate consumer information from the definition of personal information, among other clarifying amendments.
Placed on Suspense File of the Senate Committee on Appropriations
  • DATA BROKER REGISTRATION: Assembly Bill 1202 requires data brokers to register with the California Attorney General.

California Senate Committee Blesses Majority of CCPA Amendments
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/california-senate-committee-blesses-majority-of-ccpa-amendments
Mon, 15 Jul 2019 10:08:31 -0400

Seven amendments to the California Consumer Privacy Act (CCPA) are one step closer to becoming law after the California Senate Committee on the Judiciary voted to advance the legislation earlier this month.

The bills now head to the Committee on Appropriations for a vote next month, followed by a vote of the full Senate. The legislature has until September 13, 2019 to pass bills.

The most consequential and anticipated of the amendments, A.B. 25, A.B. 846, A.B. 1202, A.B. 1564, and A.B. 1146, were changed by the Judiciary Committee. That means they will require the consent of the California Assembly before they can head to the governor’s desk for a final signature.

By comparison, two technical amendments, A.B. 874 and A.B. 1355, were advanced without changes by the Judiciary Committee. If approved in their current form by the full Senate, these amendments will move directly to the governor for a signature.

Three CCPA amendments failed to secure approval by the Judiciary Committee and are unlikely to further advance. These are A.B. 1416, A.B. 873, and A.B. 981.

Here’s the full list of CCPA amendments:

Amended and Approved by the Judiciary Committee

  • EMPLOYEE EXEMPTION: Assembly Bill 25 changes the CCPA so that the law does not cover collection of personal information from job applicants, employees, business owners, directors, officers, medical staff, or contractors.
What’s New? The Senate Committee weakened the employee exception by sunsetting the exemption on January 1, 2021, and negating the exemption with regard to the CCPA’s notice and data breach liability provisions.
  • LOYALTY PROGRAMS: Assembly Bill 846 provides certainty to businesses that certain prohibitions in the CCPA would not apply to loyalty or rewards programs.
What’s New? The bill was amended to prohibit a business from selling personal information of consumers collected as part of a loyalty, rewards, discount, premium features, or club card program.
  • DATA BROKER REGISTRATION: Assembly Bill 1202 requires data brokers to register with the California Attorney General.
What’s New? The amendment dropped language that would have provided consumers the right to opt-out of the sale of their personal information by data brokers.
  • CONSUMER REQUEST FOR DISCLOSURE METHODS: Assembly Bill 1564 requires businesses to provide two methods for consumers to submit requests for information, including, at a minimum, a toll-free telephone number. A business that operates exclusively online and has a direct relationship with a consumer from whom it collects personal information is only required to provide an email address for submitting CCPA requests.
What’s New? The original Assembly amendment proposed to allow a business to provide consumers with either a toll-free number or an email address and physical address. The Senate bill brings back the toll-free number requirement, but exempts online-only businesses from operating a toll-free number.
  • VEHICLE WARRANTIES & RECALLS: Assembly Bill 1146 exempts vehicle information retained or shared for purposes of a warranty or recall-related vehicle repair.
What’s New? The bill was amended to more clearly describe vehicle recalls.

Approved by the Judiciary Committee

  • PUBLICLY AVAILABLE INFORMATION: Assembly Bill 874 streamlines the definition of “publicly available” to mean information that is lawfully made available from federal, state, or local government records. The bill also seeks to amend the definition of “personal information” to exclude deidentified or aggregate consumer information.
  • CLARIFYING AMENDMENTS: Assembly Bill 1355 exempts deidentified or aggregate consumer information from the definition of personal information, among other clarifying amendments.
Failed to Secure Approval
  • GOVERNMENT REQUESTS; FRAUD EXCEPTION: Assembly Bill 1416 creates exceptions for businesses complying with government requests; provides exceptions for the sale of information for detection of security incidents or fraud.
  • AMENDMENTS TO DEFINITIONS: Assembly Bill 873 broadens the definition of “deidentified” and clarifies that “personal information” includes information that “is reasonably capable of being associated with” a consumer or household.
  • INSURANCE EXEMPTIONS: Assembly Bill 981 exempts insurance institutions, agents, and insurance-support organizations (i.e., organizations assembling or collecting information about natural persons for the primary purpose of providing the information to an insurance institution or agent for insurance transactions) from complying with CCPA.
Our team will continue to track any new developments in the California Senate as these bills continue to be reviewed by the legislature.

Senators Introduce Bipartisan Effort to Regulate Health Apps, Biometrics & Wearables
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/senators-introduce-bipartisan-effort-to-regulate-health-apps-biometrics-wearables
Tue, 25 Jun 2019 16:11:02 -0400

A new bill introduced in the Senate Health, Education, Labor, and Pensions (HELP) Committee would impose federal regulatory obligations on health technology businesses that collect sensitive health information from their service users and customers.

The Protecting Personal Health Data Act, S.1842, introduced by Senators Amy Klobuchar (D-Minn.) and Lisa Murkowski (R-Alaska), seeks to close a growing divide between data covered by the Health Insurance Portability and Accountability Act (HIPAA) and non-covered, sensitive personal health data.

More specifically, the bill would regulate consumer devices, services, applications, and software marketed to consumers that collect or use personal health data. This would include genetic testing services, fitness trackers, and social media sites where consumers share health conditions and experiences. Often, these technologies and services operate independently of traditional, HIPAA-covered healthcare operations involving hospitals, healthcare providers, and insurance companies.

The bill directs the U.S. Department of Health and Human Services (HHS) to promulgate rules that would strengthen the privacy and security of such personal health data. The bill contemplates that the new rule would:

  • Set appropriate uniform standards for consent related to handling of genetic data, biometric data, and personal health data;
  • Include exceptions for law enforcement, research, determining paternity, or emergency medical treatment;
  • Set minimum security standards appropriate to the sensitivity of personal health data;
  • Set limits on the use of the personal health data;
  • Provide consumers with greater control over use of personal health data for marketing purposes; and
  • Create rights to data portability, access, deletion, and opt-outs.

Inevitably, the success or failure of the legislation will be tied to federal baseline privacy legislation already pending in Congress. Those efforts are ongoing, but have lost momentum in recent months as focus turns to California’s new privacy law taking effect on January 1, 2020.

Nevada and Maine Advance Legislation Addressing the “Sale” of Personal Data
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/nevada-and-maine-advance-legislation-addressing-the-sale-of-personal-data
Tue, 04 Jun 2019 14:37:34 -0400

While businesses rightfully have been focused on preparing for the California Consumer Privacy Act (“CCPA”), the Nevada and Maine Legislatures have moved forward with legislation that, like the CCPA, features new requirements relating to the sale of consumer personal data. The Nevada bill, which was signed into law on May 29 and amends an existing data privacy statute, requires companies to provide a designated channel through which consumers can opt out of the sale of their personal data. The Maine bill, which has passed house and senate votes, notably would require opt-in consent prior to the sale of personal data; however, the law would narrowly apply to Internet Service Providers (“ISPs”) and exclude online companies perhaps more commonly associated with the disclosure and sale of consumer data.

• Nevada

Nevada’s SB 220 amends the state’s existing online privacy notice statute, NRS 603A.300 to .360, to add a provision that requires “operators” – which include most companies that conduct business online with Nevada residents – to comply with a consumer’s do-not-sell request (health care and financial institutions subject to HIPAA and GLBA are out of scope of the law). As of the October 1, 2019 effective date, operators are required to create a “designated request address,” such as an email address, toll-free number, or website, through which consumers can submit a “verified request” to restrict the sale of covered data. A “verified request” is one where the operator can reasonably verify the authenticity of the request and the consumer’s identity using “commercially reasonable means,” which the law does not define.

The personal information covered under the law includes personal data such as name, address, and SSN, as well as online contact information, and any other data collected by the company that could be viewed as personally identifiable. Notably, the law defines “sale” more narrowly than the CCPA to include the exchange of covered information for “monetary consideration” to a person “for the person to license or sell the covered information to additional persons.”

Operators will have 60 days to respond to a consumer’s do-not-sell request, though this timeline may be extended by up to 30 days where the operator deems it necessary and notifies the consumer. The provision will be enforced by the Nevada Attorney General’s Office, which can impose a penalty of up to $5,000 per violation.

• Maine

The bill advanced by the Maine Legislature, titled “An Act to Protect the Privacy of Online Customer Information,” would, among other things, prohibit ISPs’ use, disclosure, and sale of “customer personal information” without a customer’s opt-in consent, except under limited circumstances such as to provide the requested service, to collect payment, and several other narrow scenarios. Customer personal information subject to the law would broadly include (1) personally identifiable information about an ISP customer; and (2) information relating to a customer’s use of broadband Internet access service, including web browsing history, app usage, device identifiers, geolocation data, and other usage information. ISPs also would be prohibited from making the sale of data mandatory under the applicable terms of service, or from refusing service to customers who do not consent to data collection.

The bill is an attempt to restore at the state level core provisions of the FCC’s 2016 broadband privacy order that were repealed by Congress in 2017. The Maine State Chamber of Commerce has opposed the bill, claiming that ISPs are being unfairly singled out and arguing that the law would give consumers a false sense of privacy, given that large web-based companies such as Facebook and Google would not be subject to it. The Governor still must sign the final legislation, which would take effect July 1, 2020.

Doing Business in India? Keep an Eye on This….
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/doing-business-in-india-keep-an-eye-on-this
Fri, 01 Mar 2019 17:24:33 -0500

The draft National E-Commerce Policy (“Draft Policy”), released by the Government of India on February 23, 2019 for stakeholder comments, has left the e-commerce sector jittery. For global market players, the protectionist construct of the Draft Policy seems to suggest a shift of India’s focus from ‘Ease of Doing Business in India’ to ‘Make in India’. If the Draft Policy is implemented in its present form, it may have a serious impact, demanding drastic changes in internal strategies, policies, and cost allocations for foreign companies with an e-commerce presence in India. The Draft Policy is open for stakeholder comments through March 9, 2019.

The Draft Policy focuses on: (i) restriction on cross-border flow of data; (ii) local presence and taxability of foreign entities having significant economic presence in India; (iii) creating a robust digital infrastructure for e-commerce, from online custom clearance to online resolution of consumer complaints; (iv) promoting exports from India with a boost to start-ups and small firms; and (v) regulatory changes to augment economic growth in e-commerce.

The key highlights of the Draft Policy are as follows:

1. Local Control over Data: The Draft Policy identifies data as an asset and provides that Indian citizens and companies should receive the economic benefits of the monetization of India’s data. The Draft Policy has suggested the following measures in this regard:

(i) To create a legal and technological framework for cross-border data flows, as well as domestic data flows used for public policy, covering (a) data collected by publicly installed ‘Internet of Things’ (IoT) devices; and (b) user data generated over e-commerce and social media platforms.

(ii) It prohibits business entities from sharing sensitive data stored outside India with any other business entity or third party, even with consumer consent. Non-compliance with these obligations may lead to consequences to be prescribed by the Government.

(iii) It exempts certain types of cross-border data flow from these restrictions: (a) data not collected in India; (b) B2B data transfers outside India under a commercial contract; (c) software and cloud computing services that have no personal or community impact; and (d) multinational corporations moving internal data that is not collected from social media, e-commerce platforms, search engines, etc.

(iv) The Draft Policy also suggests community sharing of data with start-ups, in the larger public interest, with the help of a ‘Data Authority’ that is to be established.

2. Digital Infrastructure: The Draft Policy aims to make India data-storage ready by providing infrastructure guidelines for local data centers, server farms, etc., and to shift services, including government services, to a digital regime. The Draft Policy provides the industry a sunset period of three years to adjust to the data storage requirements and infrastructure set-up. It also suggests creating domestic alternatives to foreign cloud-based and email services. However, there is no blanket or express ban on overseas data storage, other than as discussed above.

3. Foreign Investment (FDI) in E-Commerce Sector: The Draft Policy is in line with the present FDI policy, which permits FDI only in marketplace models and not in inventory-based models.

4. Mandatory Registration in India: Amongst other strict anti-counterfeit and anti-piracy measures suggested by the Draft Policy, it requires e-commerce sites/apps available for download in India to be registered in India as the importer on record or as the entity through which all sales in India are transacted. Such entities must also appoint a local representative.

5. Other Suggested Regulatory Changes: Other regulatory changes suggested by the Draft Policy include: (i) reduction in advertising charges in e-commerce for start-ups and small enterprises; (ii) the Government reserving the right to require disclosure of source code and algorithms, especially for AI technology platforms; (iii) special status to be afforded to start-ups and small enterprises vis-à-vis data access; (iv) a suggested amendment to the Indian Income Tax Act to include ‘significant economic presence’ as a means to determine permanent establishment in India and the allocation of profits of multinationals; (v) levying customs duties on electronic transmissions; (vi) addressing digital payment and other financial issues inherent in the e-commerce sector, including privacy of information; (vii) increased responsibility and liability of e-commerce and social media platforms as to the genuineness of information posted on their sites; and (viii) establishment of e-consumer courts to address grievances/disputes online.

****

This post was written by Rahul Mahajan and Anushka Jasuja from our affiliate office in Mumbai. Learn more about our India practice here.

FTC Files Lawsuit Against Taiwanese Manufacturer for Alleged Lax Security in Wireless Routers and Cameras and Related Marketing Claims
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/ftc-files-lawsuit-against-taiwanese-manufacturer-for-alleged-lax-security-in-wireless-routers-and-cameras-and-related-marketing-claims
Fri, 06 Jan 2017 12:42:47 -0500

The Federal Trade Commission has filed a lawsuit in federal court claiming that a networking equipment manufacturer engaged in unfair and deceptive acts, exposing thousands of consumers to the risk of cyberattack from vulnerable wireless routers and internet cameras.

The complaint against Taiwan-based networking equipment manufacturer D-Link Corporation and its U.S. subsidiary D-Link Systems alleges that the companies failed to take reasonable steps to protect their internet routers and IP cameras from “widely known and reasonably foreseeable” vulnerabilities. According to the complaint, these risks were not purely theoretical: D-Link equipment has been compromised by attackers, including being made part of “botnets,” which are large-scale networks of computers infected by malicious software.

In particular, the complaint alleges that the company failed to take steps to address well-known and easily preventable security flaws, such as:

  • “hard-coded” login credentials integrated into D-Link camera software -- such as the username “guest” and the password “guest” -- that could allow unauthorized access to the cameras’ live feed;
  • a software flaw known as “command injection” that could enable remote attackers to take control of consumers’ routers by sending them unauthorized commands over the Internet (a generic sketch of this class of flaw appears after this list);
  • the mishandling of a private key code used to sign into D-Link software, such that it was openly available on a public website for six months; and
  • leaving users’ login credentials for D-Link’s mobile app unsecured in clear, readable text on their mobile devices, even though there is free software available to secure the information.
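To illustrate the “command injection” category referenced in this list (a generic sketch, not D-Link’s actual code): the flaw arises when remotely supplied input is concatenated into a shell command. The safer pattern validates the input and passes it as a discrete argument, with no shell involved. The ping example and the hostname allow-list rule below are assumptions for illustration only.

```python
# Generic sketch of avoiding command injection (e.g., a router "ping" diagnostic).
# Unsafe pattern: os.system("ping -c 1 " + host) lets input like "8.8.8.8; reboot"
# run arbitrary commands. The version below validates input and avoids the shell.
import re
import subprocess

HOST_PATTERN = re.compile(r"^[A-Za-z0-9.\-]{1,253}$")  # illustrative allow-list rule

def ping(host: str) -> int:
    """Run a single ping safely: validate input, pass it as one argv element, no shell."""
    if not HOST_PATTERN.match(host):
        raise ValueError("invalid host")
    result = subprocess.run(["ping", "-c", "1", host], check=False)
    return result.returncode

# Example: a malicious value is rejected instead of being handed to a shell.
# ping("8.8.8.8; reboot")  -> raises ValueError
```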
Count I of the complaint alleges that D-Link’s failure to take reasonable measures to secure the products from these vulnerabilities was unfair under Section 5 of the FTC Act. It alleges that D-Link’s practices caused, or are likely to cause, substantial injury to consumers that is not outweighed by countervailing benefits to consumers or competition and is not reasonably avoidable by consumers.

But the FTC is not only concerned with the potential vulnerabilities of the D-Link products; in Counts II through VI, the FTC alleges that D-Link violated Section 5(a) of the FTC Act by making deceptive statements about the products’ security. These allegedly deceptive statements include the following:

Count II: D-Link advertised a Security Event Response Policy, implying that D-Link had taken reasonable measures to secure the products from unauthorized access;

Count III: In promotional materials, D-Link claimed that its routers were “EASY TO SECURE” and had “ADVANCED NETWORK SECURITY,” among other claims, implying that the routers were secure from unauthorized access and control;

Count IV: In promotional materials, D-Link advertised that its cameras provided a “secure connection,” among other claims, implying that the cameras were secure from unauthorized access and control;

Count V: To begin using the routers, a graphical user interface provided security-related prompts such as “To secure your new networking device, please set and verify a password below,” implying that the routers were secure from unauthorized access and control; and

Count VI: To begin using the cameras, a graphical user interface provided security-related prompts such as “Set up an Admin ID and Password” or “enter a password” in order “to secure your camera” and featured a lock logo, implying that the cameras were secure from unauthorized access and control.

In a press release announcing the lawsuit, FTC Bureau of Consumer Protection Director Jessica Rich commented, “When manufacturers tell consumers that their equipment is secure, it’s critical that they take the necessary steps to make sure that’s true.”

The Commission vote authorizing the staff to file the complaint was 2-1, with Commissioner Maureen K. Ohlhausen voting against the complaint. The complaint was filed in the U.S. District Court for the Northern District of California.

The complaint is just the most recent action in the FTC’s efforts to crack down on potential vulnerabilities in the Internet of Things (IoT). The FTC has also brought enforcement actions against ASUS over allegedly insecure routers and cloud services and against TRENDnet over its allegedly insecure cameras. This case serves as yet another reminder that the FTC remains focused on cyber security, especially for IoT devices, and that it is important for all businesses that handle or have access to customer information to ensure that they have implemented reasonable security practices, and confirmed the accuracy of all related marketing claims and public representations (including in public-facing policies and product dashboards) about the security of their products.

FCC Votes to Impose Aggressive New Privacy Rules on Broadband Providers https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/fcc-votes-to-impose-aggressive-new-privacy-rules-on-broadband-providers https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/fcc-votes-to-impose-aggressive-new-privacy-rules-on-broadband-providers Sat, 29 Oct 2016 12:04:53 -0400 At the Federal Communications Commission’s (“FCC”) Open Meeting on October 27, the Commission voted along party lines (3-2) to impose more stringent rules on broadband Internet service providers (“ISPs”). Chairman Tom Wheeler, along with Commissioners Rosenworcel and Clyburn, voted in favor of the item, while Commissioners Pai and O’Rielly voted against it.

The new rules clarify the privacy requirements applicable to broadband ISPs pursuant to Section 222 of the Communications Act. The new rules also apply to voice services and treat call-detail records as “sensitive” in that context. According to an FCC press release issued immediately after the meeting, these rules “establish a framework of customer consent required for ISPs to use and share their customers’ personal information that is calibrated to the sensitivity of the information.” The Commission further asserts that this approach is consistent with the existing privacy framework of the Federal Trade Commission (“FTC”). The actual text of the order is not yet available, but a fact sheet and press release outline its core components. Under the order, mobile and fixed broadband ISPs will apparently be subject to the following requirements:
  • Opt-in: ISPs must obtain affirmative consent from consumers to use and share “sensitive” information. Under the new rules, the following categories of information are considered sensitive: precise geolocation information, financial information, health information, children’s information, Social Security numbers, web browsing history, app usage history, and the contents of communications.
  • Opt-out: ISPs can use and share “non-sensitive” information unless a customer “opts out.” All other individually identifiable customer information is considered “non-sensitive,” and may be used by ISPs consistent with consumer expectations.
  • Exceptions to Consent: Customer consent can be inferred for certain purposes specified in the statute, such as provision of broadband service, billing and collection, and marketing of services and equipment that are ancillary to broadband service. In such cases, no further consent is required beyond creation of the customer-ISP relationship.
  • Notice & Transparency: ISPs must notify customers about the types of information collected, the uses that could be made of such information, and with whom such information may be shared. Although ISPs must provide such information to customers from the outset, it is a continuing obligation – ISPs must notify their customers of material changes to their privacy policies and make such information persistently available on their website or mobile app. Moreover, in response to contemporary “pay for privacy” controversies, the Commission will impose heightened disclosure requirements where ISPs offer discounts in exchange for greater rights to use customer information. Finally, the Commission has directed its Consumer Advisory Committee to develop a privacy notification standard that will afford a safe harbor to adopting providers.
The rules also address other issues, including the following:
  • Data Protection: The new rules require ISPs to take reasonable data security measures. To fulfill this requirement, ISPs may: a) adopt current industry best practices; b) provide accountability and oversight for security practices; c) use robust customer authentication tools; and d) dispose of data in a manner consistent with FTC best practices and the Consumer Privacy Bill of Rights.
  • Breach Response: ISPs must notify affected customers within 30 days of determining that a breach has occurred. For breaches affecting 5,000 or more customers, ISPs must also notify the Commission, the FBI, and the Secret Service within 7 business days. For breaches affecting fewer than 5,000 customers, ISPs must notify the Commission at the same time as affected customers (within 30 days).
The Rationale: Consumer Rights and Technological Change
In the fact sheet, the Commission states that ISPs serve as “a consumer’s ‘on-ramp’ to the Internet,” observing that “[p]roviders have the ability to see a tremendous amount of their customers’ personal information that passes over that Internet connection” and asserting that consumers should have the right to decide how such information is used and shared. The Commission intends for the rules “to evolve with changing technologies and encourage innovation.”

De-identified Information
ISPs may utilize de-identified information without consumer consent. De-identified information consists of data sets that have been modified so that they can no longer be traced to individual users or devices. However, in recognition of the fact that ISPs might otherwise have the ability and incentive to re-identify customer information, the order adopts a three-part test which the FTC created in 2012 to determine whether de-identified information may be shared without consumer consent. Pursuant to this framework, in order for an ISP to rely on de-identification without notice and consent, it must:
  1. Alter customer information so that it cannot reasonably be linked to a specific individual or device.
  2. Publicly commit to (a) maintain and use the data in an unidentifiable format and (b) make no efforts to re-identify the information.
  3. Contractually prohibit re-identification of shared information.
Consumer Empowerment Efforts: Ending Contracts of Adhesion and Enabling Dispute Resolution
The Commission appears to be concerned with the bargaining power differential between customers and providers. In an effort to give consumers greater leverage, the order bans ISPs from “take-it-or-leave-it” offers and forces them to serve customers who do not consent to the commercial use or dissemination of their information. The order also purportedly addresses a recent controversy over mandatory arbitration clauses in ISP-consumer contracts by reiterating the right of consumers to utilize the Commission’s informal dispute resolution process, and signals the Commission’s intent to address the matter more directly in a rulemaking in February 2017.

* * *

The Broadband Privacy Order is an important and controversial decision. Commissioner Rosenworcel touted the rules as “real privacy control for consumers.” Commissioner Clyburn similarly praised the benefits of the rules for consumer protection, but openly acknowledged that “with respect to the future of privacy, I think we still have work to do” and saw the need for further harmonization efforts vis-a-vis the FTC. Commissioners Pai and O’Rielly both voiced strong dissents. Commissioner Pai emphasized that these rules were out of sync with FTC standards, warned that “[n]othing in these rules will stop edge providers from harvesting and monetizing your data,” and expressed concern that the order sets forth “one-sided rules that will cement edge providers’ dominance in the online advertising market.” Commissioner O’Rielly expressed frustration with the order’s new opt-in requirements, stating that the use of an “opt-in consent mechanism results in far fewer individuals conveying their consent than is the case under an opt-out consent mechanism even when substantial benefits are at stake.”

FTC Chairwoman Edith Ramirez released a statement praising the order: “I am pleased that the Federal Communications Commission has adopted rules that will protect the privacy of millions of broadband users. The rules will provide robust privacy protections, including protecting sensitive information such as consumers’ social security numbers, precise geolocation data, and content of communications, and requiring reasonable data security practices. We look forward to continuing to work with the FCC to protect the privacy of American consumers.”

Although the full order has yet to be released, Chairman Wheeler indicated at a press conference following the meeting that there was a relatively strong chance it would be released within the next 24 to 48 hours. Details aside, it is clear that today’s decision (if upheld) will change the communications and privacy landscape. We will post updates here as we learn more about the new Broadband Privacy rules. Finally, Kelley Drye will soon offer a free webinar that will unpack the new order. Once the full order becomes available, we will announce the date and time of the webinar.
What You Need to Know About Privacy Shield: An Overview of the New Transatlantic Framework https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/what-you-need-to-know-about-privacy-shield-an-overview-of-the-new-transatlantic-framework https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/what-you-need-to-know-about-privacy-shield-an-overview-of-the-new-transatlantic-framework Fri, 15 Jul 2016 14:05:36 -0400 On July 12, 2016, the European Commission (“Commission”) formally adopted and released the Privacy Shield Adequacy decision, which will allow certified U.S. companies to transfer EU personal data to the United States. The EU-U.S. Privacy Shield (“Privacy Shield”) replaces the U.S.-EU Safe Harbor framework (“Safe Harbor”), which was invalidated in October 2015 by the European Court of Justice (“ECJ”) in Maximillian Schrems v Data Protection Commissioner. The decision will immediately go into effect upon notification to the EU Member States.

The more than 4,400 U.S. companies that previously relied on the Safe Harbor and have been waiting for an alternative mechanism for data transfers can choose to self-certify to the Department of Commerce (“Commerce”) under the new Privacy Shield framework. Commerce will begin accepting Privacy Shield applications on August 1, 2016. This client advisory provides an overview of Privacy Shield, highlights key differences between Privacy Shield and Safe Harbor, and offers some key considerations given the forthcoming General Data Protection Regulation and other data privacy developments.

Privacy Shield
The Commission’s adequacy decision found that U.S. mechanisms and regulations under Privacy Shield provide an adequate level of protection for international data transfers. Since October 2015, the Commission and U.S. authorities have negotiated Privacy Shield, which has undergone substantive changes since the Commission issued a draft adequacy decision on February 29, 2016. The key elements of Privacy Shield include:
  • Robust enforcement and strong obligations on companies handling Europeans’ personal data. Commerce will play a more significant role in monitoring certification and ensuring that false claims of Privacy Shield participation are appropriately sanctioned. Companies that withdraw from Privacy Shield must annually affirm to Commerce their commitment to apply the Privacy Shield principles until the personal data are returned or deleted. Notably, accountability for onward transfers has changed: companies must now comply with notice and choice principles for onward transfers in addition to entering into contractual agreements that guarantee the same level of protection.
  • Clear safeguards and transparency obligations on U.S. government access. Under Privacy Shield, the U.S. has committed to a new oversight mechanism for national security interference in the form of an Ombudsperson that will be independent from the U.S. intelligence authorities. Relying on Presidential Policy Directive 28, the USA Freedom Act, and written assurances by the U.S. government, the Commission concluded that Privacy Shield effectively protects EU citizens against generalized access to personal data.
  • Effective protection of EU citizens’ rights with several redress possibilities: EU data subjects may file complaints directly with a U.S. self-certified company, with a free-of-charge independent dispute resolution body, as designated by the company, with national data protection authorities (“DPAs”), or with the Federal Trade Commission (“FTC”). Additional referral options are built into Privacy Shield so as to ensure compliance with the privacy principles. Complaints must be resolved by companies within 45 days. If a case is not resolved, an arbitration mechanism, the “Privacy Shield Panel,” may be convened to guarantee an enforceable remedy.
  • Annual joint review mechanism. To ensure U.S. accountability to commitments with respect to public authority access to personal data, the Commission and Commerce will carry out an annual review that will involve, when appropriate, DPAs, U.S. national security authorities, and the independent Ombudsperson. The results of this review will be incorporated into a public report issued to the European Parliament and the Council.
Like Safe Harbor, under Privacy Shield, companies must self-certify adherence to a set of privacy principles to Commerce. These include: (1) the Notice Principle; (2) the Choice Principle; (3) the Accountability for Onward Transfer Principle; (4) the Security Principle; (5) the Data Integrity and Purpose Limitation Principle; (6) the Access Principle; and (7) the Recourse, Enforcement, and Liability Principle. Privacy Shield also includes a detailed set of supplemental principles.[1]

The comparison below highlights some key differences between Privacy Shield and Safe Harbor:
  1. Self-Certification
    • Privacy Shield: Annual re-certification of compliance to Commerce is required.
    • Safe Harbor: Annual self-certification letters reaffirming commitment to the framework are required.
  2. Removal from the Framework’s List
    • Privacy Shield: Upon removal from the Privacy Shield list, organizations must (1) return or delete EU personal data collected under the Privacy Shield or (2) continue to apply the Privacy Principles to that personal information and affirm this commitment to Commerce.
    • Safe Harbor: Upon removal from the Safe Harbor list, organizations must promptly delete any data collected under the Safe Harbor.
  3. Privacy Principles: Notice
    • Privacy Shield: In addition to the disclosure obligations of the Safe Harbor, organizations must also inform data subjects of (1) the purpose for third parties’ disclosures of information, (2) their right to access their personal data, (3) the company’s liability for onward transfers to third parties, and several other significant rights of data subjects.
    • Safe Harbor: Organizations must inform data subjects about the purpose for data collection and use, how to contact the company regarding inquiries or complaints, the types of third parties to which the company discloses information, and the choices and means offered to individuals to limit use and disclosure.
  4. Privacy Principles: Choice
    • Privacy Shield: The organization must offer data subjects the opportunity to opt out of having their personal information used for a purpose that is materially different from the purpose for which it was originally collected or subsequently authorized.
    • Safe Harbor: The organization must offer data subjects the opportunity to opt out of having their personal information used for a purpose that is incompatible with the purpose for which it was originally collected or subsequently authorized.
  5. Privacy Principles: Onward Transfer
    • Privacy Shield: The organization must enter into a contract with the third-party controller and adhere to specific onward transfer requirements, and it bears a rebuttable presumption of liability for violations of sub-processing obligations by the third-party controller.
    • Safe Harbor: The organization must have a contract with the third-party onward transfer recipient or ascertain that the third party subscribes to the Principles or is subject to the Directive or another adequacy finding. If the company complies with these requirements, it will not be held responsible for violations by a third-party onward transfer recipient.
  6. Redress Possibilities
    • Privacy Shield: Data subjects have multiple options for lodging complaints. They can bring a complaint directly with the organization, with a free-of-charge independent dispute resolution body designated by the company, with national DPAs, or with the FTC.
    • Safe Harbor: Data subjects are encouraged to raise complaints with the relevant organization before proceeding to an independent recourse mechanism, which must be readily available and affordable.

Timeline for Full Implementation
The Privacy Shield decision enters into force immediately upon notification to EU Member States. To that end, Commerce has issued a guide to self-certification and will begin accepting self-certifications to the Privacy Shield on August 1. The guide highlights five steps for companies wishing to come into compliance and certify under the Privacy Shield:
  1. Confirm Your Eligibility: Companies subject to Federal Trade Commission or Department of Transportation jurisdiction are eligible to participate in Privacy Shield.
  2. Develop a Compliant Privacy Policy Statement: Develop a privacy policy that (1) conforms to Privacy Shield principles, (2) references your Privacy Shield compliance, (3) identifies your independent recourse mechanism, and (4) is publicly available. Companies should also provide accurate information about the location of the privacy policy when self-certifying.
  3. Identify Your Independent Recourse Mechanism: Identify an independent recourse mechanism available to investigate unresolved complaints at no cost to the individual.
  4. Establish and Put in Place a Verification Mechanism: Use a self-assessment or an outside/third-party assessment program to verify compliance consistent with Privacy Shield’s verification requirement.
  5. Designate a Privacy Shield Contact: Designate a contact for the handling of questions, complaints, access requests, and any other issues arising under Privacy Shield.
What Now?
While the European Commission and the U.S. government have expressed optimism regarding the legal certainty for businesses under Privacy Shield, the framework may very well be challenged in EU courts or require renegotiation once the General Data Protection Regulation comes into force in May 2018.[2]

The EU privacy watchdogs that make up the Article 29 Working Party (the “Working Party”) expressed concerns with an earlier draft of Privacy Shield. Criticizing Privacy Shield’s overall lack of clarity and accessibility, the Working Party noted that U.S. representations did not rule out bulk U.S. government data collection[3] and that the framework lacked detail regarding the newly established Ombudsperson. Although the Commission and the U.S. agreed on additional clarifications on bulk collection of data, and strengthened the role of the Ombudsperson, it is unclear whether these and other changes are sufficient to protect the rights of EU citizens.

Additionally, privacy advocates have raised concerns that Privacy Shield suffers from the same underlying issues as the now-defunct Safe Harbor. A prompt legal challenge is likely to determine whether Privacy Shield can withstand the scrutiny set forth in the ECJ’s decision in Schrems.

What is certain is that the FTC and Commerce will play an active role in enforcement of data transfers under this framework. FTC Chairwoman Edith Ramirez stated that the FTC would “continue to work closely with our European counterparts to provide robust privacy and data security protections for consumers in the United States and Europe.”

In light of this, the best course of action for companies, particularly those that have yet to enter into model contracts, is to evaluate Privacy Shield and their data transfer practices to determine whether self-certification makes sense for them.

Kelley Drye’s Privacy & Information Security practice group is well-versed in privacy law at the federal and state levels and stands ready to help companies evaluate Privacy Shield and their data transfer practices. For more information about this and related issues, please contact one of the attorneys below.

Dana Rosenfeld (202) 342-8588 [email protected]

Alysa Hutnik (202) 342-8603 [email protected]


[1] (1) Sensitive Data; (2) Journalistic Exceptions; (3) Secondary Liability; (4) Performing Due Diligence and Conducting Audits; (5) Role of Data Protection Authorities; (6) Self-Certification; (7) Verification; (8) Access; (9) Human Resources Data; (10) Obligatory Contracts for Onward Transfers; (11) Dispute Resolution and Enforcement; (12) Choice – Timing of Opt Out; (13) Travel Information; (14) Pharmaceutical and Medical Products; (15) Public Record and Publicly Available Information; and (16) Access Requests by Public Authorities.

[2] The General Data Protection Regulation will allow for a periodic review of prior adequacy decisions.

[3] For example, the Presidential Policy Directive 28, which provides principles for intelligence collection, is not a legislative act and therefore cannot create actionable rights for individuals in a court of law.

Metropolitan Corporate Counsel Interviews Special Counsel Richard Cohen Regarding Innovations in Technology Affecting Big Data https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/metropolitan-corporate-counsel-interviews-special-counsel-richard-cohen-regarding-innovations-in-technology-affecting-big-data https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/metropolitan-corporate-counsel-interviews-special-counsel-richard-cohen-regarding-innovations-in-technology-affecting-big-data Wed, 09 Dec 2015 09:42:07 -0500 Special counsel Richard Cohen was interviewed by Metropolitan Corporate Counsel in the law article “Weighing Public vs. Private Interests in the Big Data Economy: Innovations in technology continue to bring more questions about privacy.” Mr. Cohen discusses the current startup environment, big data, and venture capital accelerators based on his extensive experience working with technology companies on transactions, outsourcing agreements, strategic alliances, software licensing and development, cloud services, and other commercial and corporate matters. He says, “[for many organizations] a growing area of interest is the need to balance the use of big data for public good while respecting privacy and individual rights.”

To read the full article, please click here.
