As workforces become increasingly mobile and remote work is more the norm, employers face the challenge of balancing the protection of their employees’ personal data and privacy against the need to collect and process personal data to recruit, support and monitor their workforces. Mounting regulations attempt to curb employers’ ability to gather and utilize employee data—from its historical use in processing employee benefits and leave requests to employers’ collection, use or retention of employees’ biometric data to ensure the security of the organization’s financial or other sensitive information systems. Learn what employers can do now to protect employee data and prepare for the growing wave of data privacy laws impacting the collection and use of employee personal data.
Avoiding Price Gouging Claims
Wednesday, August 3

Recently, State Attorneys General, the House Judiciary Committee, and many others have weighed in on rising prices in an attempt to weed out price gouging and other forms of what they deem “corporate profiteering.” State and federal regulators are looking carefully at pricing as consumers and constituents become more sensitive to the latest price changes, and price gouging enforcement is an avenue states may be able to use to appease the public. Unlike past emergencies, the current supply chain and labor shortages, along with skyrocketing costs for businesses, make it unrealistic for companies to simply freeze all price increases. This webinar will cover:
• The basics of price gouging laws and related state emergency declarations, and how to comply
• The differences and varied complexities in state laws
• General best practice tips
• How AGs prioritize enforcement
* * * *
Find more upcoming sessions, links to replays and more here
Watch a video version here or the audio version here.
Shoshana Gillers has served as TransUnion’s Chief Privacy Officer since September 2019. In this role Ms. Gillers oversees compliance with privacy laws across TransUnion’s global footprint and promotes a culture of responsible data stewardship.
Prior to joining TransUnion, Ms. Gillers spent four years at JPMorgan Chase, ultimately serving as Vice President and Assistant General Counsel, Responsible Banking, Data and Privacy. Previously, she served as a federal prosecutor for eight years at the U.S. Attorney’s Office in Chicago, and as a litigator for four years at WilmerHale in New York. Ms. Gillers clerked for the Hon. Robert D. Sack on the U.S. Court of Appeals for the Second Circuit and for the Hon. Aharon Barak on the Supreme Court of Israel.
Ms. Gillers received a B.A. from Columbia University, summa cum laude, and a J.D. from Yale Law School.
Alysa chairs Kelley Drye’s Privacy and Information Security practice and delivers comprehensive expertise in all areas of privacy, data security and advertising law. Her experience ranges from strategic, consumer protection-oriented due diligence and compliance counseling to defending clients in FTC and state attorneys general investigations and competitor disputes.
Prior to joining the firm, Alysa was a federal clerk for the Honorable Joseph R. Goodwin, United States District Judge, Southern District of West Virginia.
Alysa received a B.A. from Haverford College, and a J.D. from the University of Maryland Carey School of Law.
How does the measure stack up against the VCDPA and the CCPA (as amended by the CPRA)? The good news is that, in broad terms, ColoPA generally does not impose significant new requirements that aren’t addressed under the CCPA or VCDPA, but there are a few distinctions to note.
 | ColoPA | VCDPA | CCPA |
Thresholds to Applicability | Conduct business in CO or produce products or services targeted to CO and (a) control or process personal data of at least 100,000 consumers; or (b) derive revenue or receive a discount on the price of goods or services from selling personal data and control or process personal data of at least 25,000 consumers | Conduct business in or produce products or services targeted to VA and (a) control or process personal data of at least 100,000 consumers; or (b) derive over 50% of gross revenue from the sale of personal data and process or control personal data of at least 25,000 consumers | Conduct business in CA and collect personal information of CA residents and: (a) has $25 million or more in annual revenue for the preceding calendar year as of Jan. 1 of the calendar year; (b) annually buys, sells, or shares personal information of more than 100,000 consumers or households; or (c) earns more than 50% of its annual revenue from selling or sharing consumer personal information |
Consent | Requires opt-in consent for processing sensitive personal data, including children’s data, and certain secondary processing | Requires opt-in consent for processing sensitive personal data, and COPPA-compliant consent for processing children’s data | Requires opt-in consent for sharing PI for cross-context behavioral advertising for children under 16, including parental consent for children under 13 |
Opt-Out | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for targeted advertising, sales, and profiling for legal or similarly significant effects | Required for profiling, cross-contextual advertising, and sale; right to limit use and disclosure of sensitive personal information |
Other Consumer Rights | Access, Deletion, Correction, Portability | Access, Deletion, Correction, Portability | Access, Deletion, Correction, Portability |
Authorized Agents | Permitted for opt-out requests | N/A | Permitted for all requests |
Appeals | Must create process for consumers to appeal refusal to act on consumer rights | Must create process for consumers to appeal refusal to act on consumer rights | N/A |
Private Cause of Action | No | No | Yes, related to security breaches |
Cure Period? | 60 days until provision expires on Jan. 1, 2025 | 30 days | No |
Data Protection Assessments | Required for targeted advertising, sale, sensitive data, certain profiling | Required for targeted advertising, sale, sensitive data, certain profiling | Annual cybersecurity audit and risk assessment requirements to be determined through regulations |
Given the significant overlap among the three privacy laws, companies subject to ColoPA should be able to leverage VCDPA and CCPA implementation efforts for ColoPA compliance. If ColoPA is any example, other state privacy efforts may not veer too far from the paths the VCDPA and CCPA have forged. The key will be to closely monitor how the California Privacy Protection Agency and the Colorado Attorney General address forthcoming regulations, and whether they add new, distinct approaches for each state. Check back on our blog for more privacy law updates.
The California Assembly recently passed AB-1262 updating an existing law to further limit the use of personal information collected through connected TVs and smart speaker devices. Specifically, the bill prohibits:
California consumers would not have a private cause of action, but the state AG or district attorney would have the authority to seek penalties of up to $2,500 per connected TV or smart device sold or leased that violates the law. After passing the Assembly, the bill will now need to pass in the Senate and be signed by the Governor to become law. The bill was recently referred to the Senate’s Committee on Judiciary for review.
This bill signals the state’s continued focus on consumer privacy, as legislators continue to consider the privacy implications of smart devices and other technology, and the variety of ways in which companies use data for advertising purposes. We will continue to monitor the legislation and provide any updates.
* * *
Subscribe here to Kelley Drye’s Ad Law Access blog and here for our Ad Law News and Views newsletter. Visit the Advertising and Privacy Law Resource Center for updated information on key legal topics relevant to advertising and marketing, privacy, data security, and consumer product safety and labeling.

Kelley Drye attorneys and industry experts provide timely insights on legal and regulatory issues that impact your business. Our thought leaders keep you updated through advisories and articles, blogs, newsletters, podcasts and resource centers. Sign up here to receive our email communications tailored to your interests.
The bill imposes specific requirements on entities seeking to process precise geolocation data, proximity data, persistent identifiers, and personal health information (together, “covered data”) in association with COVID-19 mitigation efforts. Among other things, the Act would require:
Separately, the bill explicitly exempts covered entities from requirements under the Communications Act or regulations in relation to this processing. The bill also preempts any similar state law, although state attorneys general have enforcement authority along with the FTC.
Whether Congress will pass the measure is unclear, as Democrats and public interest organizations have voiced concerns about the bill. Still, assuming Congress can agree, it’s worth monitoring to see whether the measure may be included in any upcoming COVID-19 relief bill.
Personal information — some of which may be highly sensitive — is key to many of these efforts. Although some regulators in the U.S. and abroad have made it clear that privacy laws and the exercise of enforcement discretion provide leeway to process personal information in connection with COVID-19, they have also made it clear that privacy laws continue to apply. Federal Trade Commission (FTC) Chairman Joe Simons advises that the FTC will take companies’ “good faith efforts” to provide needed goods and services into account in its enforcement decisions but will not tolerate “deceiving consumers, using tactics that violate well-established consumer protections, or taking unfair advantage of these uniquely challenging times.” And, with many eyes on the California Attorney General’s Office in light of recent requests to delay enforcement of the California Consumer Privacy Act (CCPA), an advisor to Attorney General Xavier Becerra was quoted as stating: “We’re all mindful of the new reality created by COVID-19 and the heightened value of protecting consumers' privacy online that comes with it. We encourage businesses to be particularly mindful of data security in this time of emergency.”
Devoting some thought to privacy issues at the front end of COVID-19 projects will help to provide appropriate protections for individuals and address complications that could arise further down the road. This post identifies some of the key privacy considerations for contributors to and users of COVID-19 resources.
1. Is Personal Information Involved?
Definitions of “personal information” and “personal data” under privacy laws such as the CCPA and the EU’s General Data Protection Regulation (GDPR) are broad. Under the CCPA, for example, any information that is “reasonably capable of being associated with, or could reasonably be linked” with an individual, device, or household is “personal information.” This definition specifically includes “geolocation data.” Although some data sources provide COVID-19-related information at coarse levels of granularity, e.g., the county, state, or national level, the broad definition of “personal information” under the CCPA, GDPR, and other privacy laws makes it worth taking a close look at geographic and other types of information to determine whether the data at issue in fact qualifies as “personal information,” or whether it is sufficiently anonymized to meet privacy-law definitions of de-identified and/or aggregate data. The CCPA, HIPAA, and other privacy laws provide examples of the safeguards expected to reasonably treat data as anonymized, and employing such standards can help avoid unnecessary privacy mishaps despite well-intentioned efforts.
2. What Level(s) of Transparency Are Appropriate About the Data Practices?
Although some COVID-19 tools may be exempt from statutory requirements to publish a privacy policy (e.g., the provider of the tool is not a “business” under the CCPA), there are still reasons for providers to explain what data they collect and how they plan to use and disclose the data:
Although much remains to be seen in how federal, state, and local governments will use personal information (if at all) to develop and implement strategies to slow the spread of coronavirus, it is not unreasonable to expect that government agencies will seek information from providers of COVID-19-related tools. The extent to which a provider can voluntarily provide information to the government — as well as the procedures that the government must follow to compel the production of information (and maintain the confidentiality of it in personally identifiable form) — depends on several factors, including what kind of information is at issue and how it was collected. Becoming familiar with the rules that apply to voluntary and compelled disclosures, and safeguards to help prevent such data from being subject to broad freedom of information laws, before a request arrives can help save valuable time down the road. In many of these scenarios, for example, aggregate or pseudonymous data may be sufficient.
4. What Considerations Are There for Licensing COVID-19-Related Personal Information?
Finally, any licensing of personal information in connection with COVID-19 tools deserves careful consideration, particularly if the CCPA applies. The CCPA imposes notice and opt-out requirements on entities that “sell” personal information. “Sell” is defined to include disseminating, disclosing, or otherwise “making available” personal information to for-profit third parties in exchange for “monetary or other valuable consideration.” Several types of open source licenses require users to accept certain restrictions on their use and/or redistribution of licensed data or software. For example, the Creative Commons Attribution-NonCommercial 4.0 International license requires licensees to agree (among other conditions) not to use licensed content for commercial purposes. Obtaining this promise in exchange for personal information could constitute “valuable consideration” and give rise to a “sale” under the CCPA. In addition, while not a “sale,” sharing personal information with a government authority would qualify as a disclosure under CCPA and would need to be accurately disclosed in the privacy policy.
Neither the California Attorney General nor the courts have interpreted the CCPA in the context of open source licenses. Until more authoritative guidance becomes available, it makes sense to think through the potential obligations and other consequences of applying and accepting specific license terms to COVID-19-related personal information.
Bottom line: Personal information has a key role to play in shaping responses to the novel coronavirus. Privacy laws remain applicable to this information. Applying privacy considerations to COVID-19 related practices involving data collection, sharing, and analysis will help mitigate unnecessary harms to consumers, aside from those presented by the virus itself.
For other helpful information during this pandemic, visit our COVID-19 Resource Center.
CCPA’s Limited Private Right of Action
The Attorney General’s Office was granted wide discretion and enforcement powers to impose fines of up to $2,500 for unintentional violations and up to $7,500 for each intentional violation. Cal. Civ. Code 1798.155. The CCPA, however, provides for only a limited private right of action for individual consumers related to data security breaches. Cal. Civ. Code 1798.150. Plaintiffs can recover actual damages or statutory damages of $100 to $750. A broader private right of action was considered and would have permitted individuals to sue for any and all CCPA violations. SB 561. But that amendment failed to pass in May.
Where There’s a Will, There’s a Way?
But anyone expecting that companies will only face privacy-related consumer litigation in the context of a data breach is underselling the risk. While direct actions under the CCPA may be limited, the requirements of the CCPA may serve as the basis for claims under other consumer protection statutes. And, importantly, the public statements and policies that companies issue will be scrutinized not just for their actual compliance, but for whether companies are fulfilling their own promises. Indeed, nothing prevents individuals from filing putative consumer class action claims alleging false statements, unfair business practices, or misleading conduct by companies in connection with their privacy policies and practices.
What Types of Claims Are Likely to be Filed?
These claims are likely to be brought pursuant to other California consumer protection statutes, such as California’s Unfair Competition Law (Bus. & Prof. Code 17200), False Advertising Law (Bus. & Prof. Code 17500), and Consumer Legal Remedies Act (Civ. Code 1750). For example:
Precision in Privacy Promises
These risks are a good reminder that it is critical not just to include the CCPA-required disclosures in privacy statements and communications in response to consumer rights requests, but also to be vigilant and precise about the descriptions of privacy practices and how the company is honoring rights requests. In the end, a company’s statements about its CCPA compliance could end up triggering potential exposure far greater than anything available under the CCPA itself.
The Senate Appropriations Committee also voted to advance AB 1202, the data broker amendment, but placed the bill in the Committee’s suspense file. This procedural action holds bills that will have a significant fiscal impact on the State of California’s budget for consideration all at once to ensure that fiscal impacts are considered as a whole.
Here’s the full list of amendments as of August 12, 2019:
Ordered to Second Reading in the California Senate
The bills now head to the Committee on Appropriations for a vote next month, followed by a vote of the full Senate. The legislature has until September 13, 2019 to pass bills.
The most consequential and anticipated of the amendments, A.B. 25, A.B. 846, A.B. 1202, A.B. 1564, and A.B. 1146, were changed by the Judiciary Committee. That means they will require the consent of the California Assembly before they can head to the governor’s desk for a final signature.
By comparison, two technical amendments, A.B. 874 and A.B. 1355, were advanced without changes by the Judiciary Committee. If approved in their current form by the full Senate, these amendments will move directly to the governor for a signature.
Three CCPA amendments failed to secure approval by the Judiciary Committee and are unlikely to further advance. These are A.B. 1416, A.B. 873, and A.B. 981.
Here’s the full list of CCPA amendments:
Amended and Approved by the Judiciary Committee
Approved by the Judiciary Committee
The Protecting Personal Health Data Act, S.1842, introduced by Senators Amy Klobuchar (D-Minn.) and Lisa Murkowski (R-Alaska), seeks to close a growing divide between data covered by the Health Insurance Portability and Accountability Act (HIPAA) and non-covered, sensitive personal health data.
More specifically, the bill would regulate consumer devices, services, applications, and software marketed to consumers that collect or use personal health data. This would include genetic testing services, fitness trackers, and social media sites where consumers share health conditions and experiences. Often, these technologies and services are run independently from traditional, HIPAA-covered healthcare operations involving hospitals, healthcare providers, and insurance companies.
The bill directs the U.S. Department of Health and Human Services (HHS) to promulgate rules that would strengthen the privacy and security of such personal health data. The bill contemplates that the new rule would:
Inevitably, the success or failure of the legislation will be tied to federal baseline privacy legislation already pending in Congress. Those efforts are ongoing, but have lost momentum in recent months as focus turns to California’s new privacy law taking effect on January 1, 2020.
• Nevada
Nevada’s SB 220 amends the state’s existing online privacy notice statute, NRS 603A.300 to .360, to add a provision that requires “operators” – which include most companies that conduct business online with Nevada residents – to comply with a consumer’s do-not-sell request (health care and financial institutions subject to HIPAA and GLBA are out of scope of the law). As of the October 1, 2019 effective date, operators are required to create a “designated request address,” such as an email address, toll-free number, or website, through which consumers can submit a “verified request” to restrict the sale of covered data. A “verified request” is one where the operator can reasonably verify the authenticity of the request and the consumer’s identity using “commercially reasonable means,” which the law does not define.

The personal information covered under the law includes personal data such as name, address, and SSN, as well as online contact information, and any other data collected by the company that could be viewed as personally identifiable. Notably, the law defines “sale” more narrowly than the CCPA to include the exchange of covered information for “monetary consideration” to a person “for the person to license or sell the covered information to additional persons.”
Operators will have 60 days to respond to a consumer’s do-not-sell request, though this timeline may be extended by up to 30 days where the operator deems it necessary and notifies the consumer. The provision will be enforced by the Nevada Attorney General’s Office, which can impose a penalty of up to $5,000 per violation.
• Maine
The bill advanced by the Maine Legislature, titled “an Act to Protect the Privacy of Online Customer Information,” would among other things prohibit ISPs’ use, disclosure, and sale of “customer personal information” without a customer’s opt-in consent, except under limited circumstances such as to provide the requested service, to collect payment, and several other narrow scenarios. Customer personal information subject to the law broadly would include (1) personally identifiable information about an ISP customer; and (2) information relating to a customer’s use of broadband Internet access service, including web browsing history, app usage, device identifiers, geolocation data, and other usage information. ISPs also would be prohibited from making the sale of data mandatory under the applicable terms of service, or refusing service to customers who do not consent to data collection.

The bill is an attempt to restore at the state level core provisions within the FCC’s 2016 broadband order that were repealed by Congress in 2017. The Maine State Chamber of Commerce has opposed the bill, claiming that ISPs are being unfairly singled out, and arguing that the law would result in a false sense of privacy for consumers given that large web-based companies such as Facebook and Google would not be subject to the law. The Governor still must sign the final legislation, which would take effect July 1, 2020.
The Draft Policy focuses on: (i) restriction on cross-border flow of data; (ii) local presence and taxability of foreign entities having significant economic presence in India; (iii) creating a robust digital infrastructure for e-commerce, from online custom clearance to online resolution of consumer complaints; (iv) promoting exports from India with a boost to start-ups and small firms; and (v) regulatory changes to augment economic growth in e-commerce.
The key highlights of the Draft Policy are as follows:
1. Local Control over Data: The Draft Policy identifies data as an asset and provides that Indian citizens and companies should get economic benefits against monetization of India’s data. The Draft Policy has suggested the following measures in this regard:
(i) To create a legal and technological framework for cross-border flow of data as well as domestic data flow for public policy, for (a) data collected by public-installed ‘Internet of Things’ (IOT) devices; and (b) data of users generated over e-commerce and social media platforms.
(ii) It prohibits the disclosure of sensitive data stored by business entities outside India with any other business entity or third party, even with consumer consent. Non-compliance with these obligations may lead to consequences to be prescribed by the Government.
(iii) It has exempted certain types of cross-border data flow from restrictions: (a) data not collected in India; (b) B2B data transfers outside India under a commercial contract; (c) software and cloud computing services which have no personal or community impact; and (d) multinational corporations moving internal data which is not collected from social media, e-commerce platforms or search engines, etc.
(iv) The Draft Policy also suggests community sharing of data with start-ups for larger public interest, with the help of a ‘Data Authority,’ which is to be established.
2. Digital Infrastructure: The Draft Policy aims to make India data-storage ready by providing infrastructure guidelines for local data centers, server farms, etc., and a shift to a digital regime for services, including government services. The Draft Policy provides for a sunset period of three years for the industry to adjust to the data storage requirements and infrastructure set-up. It also suggests creating domestic alternatives to foreign cloud-based and email services. However, there is no blanket or express ban on overseas data storage, other than as discussed above.

3. Foreign Investment (FDI) in the E-Commerce Sector: The Draft Policy is in line with the present FDI policy, which permits FDI only in marketplace models and not inventory-based models.
4. Mandatory Registration in India: Amongst other strict anti-counterfeit and anti-piracy measures suggested by the Draft Policy, it requires e-commerce sites/apps available for download in India, to be registered in India as the importer on record or as the entity through which all sales in India are transacted. A local representative is also required to be appointed by entities.
5. Other Suggested Regulatory Changes: Other regulatory changes suggested by the Draft Policy include: (i) reduction in advertising charges in e-commerce for start-ups and small enterprises; (ii) the Government may reserve its right to require disclosure of source code and algorithms, especially in AI technology platforms; (iii) special status to be afforded to start-ups and small enterprises vis-à-vis data access; (iv) a suggested amendment to the Indian Income Tax Act to include ‘significant economic presence’ as a means to determine permanent establishment in India and allocation of profits of multinationals; (v) levying customs duties on electronic transmissions; (vi) addressing digital payment and other financial issues inherent in the e-commerce sector, including privacy of information; (vii) increased responsibility and liability of e-commerce and social media platforms as to the genuineness of information posted on their sites; and (viii) establishment of e-consumer courts to address grievances/disputes online.
****
This post was written by Rahul Mahajan and Anushka Jasuja from our affiliate office in Mumbai. Learn more about our India practice here.
The complaint against Taiwan-based networking equipment manufacturer D-Link Corporation and its U.S. subsidiary D-Link Systems alleges that the companies failed to take reasonable steps to protect their internet routers and IP cameras from “widely known and reasonably foreseeable” vulnerabilities. According to the complaint, these risks were not purely theoretical: D-Link equipment has been compromised by attackers, including being made part of “botnets,” which are large-scale networks of computers infected by malicious software.
In particular, the complaint alleges that the company failed to take steps to address well-known and easily preventable security flaws, such as:
But the FTC is not only concerned with the potential vulnerabilities of the D-Link products; in Counts II through VI, the FTC alleges that D-Link violated Section 5(a) of the FTC Act by making deceptive statements about the products’ security. These allegedly deceptive statements include the following:
Count II: D-Link advertised a Security Event Response Policy, implying that D-Link had taken reasonable measures to secure the products from unauthorized access;
Count III: In promotional materials, D-Link claimed that its routers were “EASY TO SECURE” and had “ADVANCED NETWORK SECURITY,” among other claims, implying that the routers were secure from unauthorized access and control;
Count IV: In promotional materials, D-Link advertised that its cameras provided a “secure connection,” among other claims, implying that the cameras were secure from unauthorized access and control;
Count V: To begin using the routers, a graphical user interface provided security-related prompts such as “To secure your new networking device, please set and verify a password below,” implying that the routers were secure from unauthorized access and control; and
Count VI: To begin using the cameras, a graphical user interface provided security-related prompts such as “Set up an Admin ID and Password” or “enter a password” in order “to secure your camera” and featured a lock logo, implying that the cameras were secure from unauthorized access and control.
In a press release announcing the lawsuit, FTC Bureau of Consumer Protection Director Jessica Rich commented, “When manufacturers tell consumers that their equipment is secure, it’s critical that they take the necessary steps to make sure that’s true.”
The Commission vote authorizing the staff to file the complaint was 2-1, with Commissioner Maureen K. Ohlhausen voting against the complaint. The complaint was filed in the U.S. District Court for the Northern District of California.
The complaint is just the most recent action in the FTC’s efforts to crack down on potential vulnerabilities in the Internet of Things (IoT). The FTC has also brought enforcement actions against ASUS over allegedly insecure routers and cloud services and against TRENDnet over its allegedly insecure cameras. This case serves as yet another reminder that the FTC remains focused on cyber security, especially for IoT devices, and that it is important for all businesses that handle or have access to customer information to ensure that they have implemented reasonable security practices, and confirmed the accuracy of all related marketing claims and public representations (including in public-facing policies and product dashboards) about the security of their products.
* * *
The Broadband Privacy Order is an important and controversial decision. Commissioner Rosenworcel touted the rules as “real privacy control for consumers.” Commissioner Clyburn similarly praised the benefits of the rules for consumer protection, but openly acknowledged that “with respect to the future of privacy, I think we still have work to do” and saw the need for further harmonization efforts vis-a-vis the FTC.

Commissioners Pai and O’Rielly both voiced strong dissents. Commissioner Pai emphasized that these rules were out of sync with FTC standards, warned that “[n]othing in these rules will stop edge providers from harvesting and monetizing your data,” and expressed concern that the order sets forth “one-sided rules that will cement edge providers’ dominance in the online advertising market.” Commissioner O’Rielly expressed frustration with the order’s new opt-in requirements, stating that the use of an “opt-in consent mechanism results in far fewer individuals conveying their consent than is the case under an opt-out consent mechanism even when substantial benefits are at stake.”

FTC Chairwoman Edith Ramirez released a statement praising the order: “I am pleased that the Federal Communications Commission has adopted rules that will protect the privacy of millions of broadband users. The rules will provide robust privacy protections, including protecting sensitive information such as consumers’ social security numbers, precise geolocation data, and content of communications, and requiring reasonable data security practices. We look forward to continuing to work with the FCC to protect the privacy of American consumers.”

Although the full order has yet to be released, at a press conference following the meeting, Chairman Wheeler indicated there was a relatively strong chance it will be released at some point in the next 24 to 48 hours. Details aside, it is clear that today’s decision (if upheld) will change the communications and privacy landscape.
We will post updates here as we learn more about the new Broadband Privacy rules. Finally, Kelley Drye will soon offer a free webinar that will unpack the new order. Once the full order becomes available, we will announce the date and time of the webinar.
* * *
The more than 4,400 U.S. companies that previously relied on the Safe Harbor and have been waiting for an alternative mechanism for data transfers can choose to self-certify to the Department of Commerce (“Commerce”) under the new Privacy Shield framework. Commerce will begin accepting Privacy Shield applications on August 1, 2016. This client advisory provides an overview of Privacy Shield, highlights key differences between Privacy Shield and Safe Harbor, and offers some key considerations given the forthcoming General Data Protection Regulation and other data privacy developments.
The chart below highlights some key differences between Privacy Shield and Safe Harbor:
Privacy Shield | Safe Harbor |
1. Self-Certification: Annual re-certification of compliance to Commerce is required. | 1. Self-Certification: Annual self-certification letters reaffirming commitment to the Framework are required. |
2. Removal from Privacy Shield List: Upon removal from the Privacy Shield list, organizations must (1) return or delete EU personal data collected under the Privacy Shield or (2) have ongoing obligations to apply Privacy Principles to personal information and affirm this commitment to Commerce. | 2. Removal from Safe Harbor List: Upon removal from the Safe Harbor list, organizations must promptly delete any data collected under the Safe Harbor. |
3. Privacy Principles: Notice. In addition to the disclosure obligations of the Safe Harbor, organizations must also inform data subjects of (1) the purpose of disclosures of information to third parties, (2) their right to access their personal data, (3) the company’s liability for onward transfers to third parties, and several other significant rights of data subjects. | 3. Privacy Principles: Notice. Organizations must inform data subjects about the purpose for data collection and use, how to contact the company regarding inquiries or complaints, the types of third parties to which the company discloses information, and the choices and means offered to individuals to limit use and disclosure. |
4. Privacy Principles: Choice. The organization must offer data subjects the opportunity to opt out of having their personal information used for a purpose that is materially different from the purpose for which it was originally collected or subsequently authorized. | 4. Privacy Principles: Choice. The organization must offer data subjects the opportunity to opt out of having their personal information used for a purpose that is incompatible with the purpose for which it was originally collected or subsequently authorized. |
5. Privacy Principles: Onward Transfer. The organization must enter into a contract with the third-party controller and adhere to specific onward transfer requirements; it bears a rebuttable presumption of liability for violations of sub-processing obligations by the third-party controller. | 5. Privacy Principles: Onward Transfer. The organization must have a contract with the third-party onward transfer recipient, or ascertain that the third party subscribes to the Principles, is subject to the Directive, or is covered by another adequacy finding. If the company complies with these requirements, it will not be held responsible for violations by a third-party onward transfer recipient. |
6. Redress Possibilities: Data subjects have multiple options for lodging complaints. They can bring a complaint directly to the organization, to a free-of-charge independent dispute resolution body designated by the company, to national DPAs, or to the FTC. | 6. Redress Possibilities: Data subjects are encouraged to raise complaints with the relevant organization before proceeding to an independent recourse mechanism, which must be readily available and affordable. |
The EU privacy watchdogs that make up the Article 29 Working Party (the “Working Party”) expressed concerns with an earlier draft of Privacy Shield. The Working Party criticized Privacy Shield’s overall lack of clarity and accessibility, noted that U.S. representations did not rule out bulk U.S. government data collection, and pointed to the lack of detail regarding the newly established Ombudsperson. Although the Commission and the U.S. agreed on additional clarifications on bulk collection of data and strengthened the role of the Ombudsperson, it is unclear whether these and other changes are sufficient to protect the rights of EU citizens.
Additionally, privacy advocates have raised concerns that Privacy Shield suffers from the same underlying issues as the now-defunct Safe Harbor. A prompt legal challenge is likely to determine whether Privacy Shield can withstand the scrutiny set forth in the ECJ’s decision in Schrems.
What is certain is that the FTC and Commerce will play an active role in enforcement of data transfers under this framework. FTC Chairwoman Edith Ramirez stated that the FTC would “continue to work closely with our European counterparts to provide robust privacy and data security protections for consumers in the United States and Europe.”
In light of this, the best course of action for companies, particularly those that have yet to enter into model contracts, is to evaluate Privacy Shield and their data transfer practices to see whether self-certification to the Privacy Shield makes sense for the company.
Kelley Drye’s Privacy & Information Security practice group is well-versed in privacy law at the federal and state levels and stands ready to help interested parties understand the potential scope of these rules and how to get involved in the proceeding. Should you have any questions about this and related issues, please contact one of the attorneys below.
Dana Rosenfeld (202) 342-8588 [email protected]
Alysa Hutnik (202) 342-8603 [email protected]
[1] (1) Sensitive Data; (2) Journalistic Exceptions; (3) Secondary Liability; (4) Performing Due Diligence and Conducting Audits; (5) Role of Data Protection Authorities; (6) Self-Certification; (7) Verification; (8) Access; (9) Human Resources Data; (10) Obligatory Contracts for Onward Transfers; (11) Dispute Resolution and Enforcement; (12) Choice – Timing of Opt Out; (13) Travel Information; (14) Pharmaceutical and Medical Products; (15) Public Record and Publicly Available Information; and (16) Access Requests by Public Authorities.
[2] The General Data Protection Regulation will allow for a periodic review of prior adequacy decisions.
To read the full article, please click here.