Insights From the FTC’s “Big Data” Workshop

Kelley Drye Client Advisory

On September 15, 2014, the Federal Trade Commission (“FTC”) hosted a public workshop entitled “Big Data: A Tool for Inclusion or Exclusion?” to further explore the use of “big data” and its impact on consumers.

More Data Has Become “Big Data”

In her opening remarks, FTC Chairwoman Edith Ramirez noted that 30% of the world’s data had been generated in the two years prior to 2013.  The proliferation of connected devices has greatly expanded the volume and types of data generated and collected, and advances in statistical methods have made it possible to examine big data to identify correlations, make predictions, and draw inferences about consumer groups.  The collection and use of big data raise new ethical and socioeconomic concerns, including:
  • Whether and how big data helps include or exclude consumers from opportunities in the marketplace, and the potentially negative consequences of excluding vulnerable populations from the data sets;
  • Whether there exists an ethical obligation to protect and regulate the consumer information being collected; and
  • How regulators, legislators, and industry could maximize the societal benefits of big data while identifying and minimizing the risks it presents.

Chairwoman Ramirez noted that the FTC would continue to ask these difficult questions.  In the absence of new legislation, the FTC will examine big data practices through the lens of existing laws under its enforcement authority.  The FTC will also continue to encourage businesses and industry to adopt self-regulatory guidance that guards against bias and disparate impacts on low-income populations in the design of their algorithms and predictive products.

Regulators Keep a Watchful Eye on Key Industry Players

FTC Commissioner Julie Brill focused her remarks on three key industry players in the big data space: traditional credit reporting agencies and their progeny using alternative scoring models; data brokers and the companies that use their products; and companies engaged in analysis of their own consumer data.  Commissioner Brill implored these players to identify and mitigate data uses that result in disparate treatment along racial, ethnic, gender, or other inappropriate lines.  In addition to industry cooperation, she urged the FTC and other agencies to devote “serious resources” to studying the real-world impact of alternative scoring models.  Commissioner Brill ended her remarks with a call for legislation to create greater transparency and accountability for data brokers, as well as their sources and their customers, to better understand how these profiles are being used and whether, and under what circumstances, they may be harming vulnerable populations.

Workshop Central Themes

The predominant issues discussed throughout the workshop were the following: (i) the extent to which big data has inclusive or exclusive effects for certain disadvantaged groups; (ii) the current legal landscape; (iii) the lack of visibility of big data; (iv) the role of education, notice, and consent; and (v) the role of regulation.
A.       Inclusivity vs. Exclusivity
Panelists generally touted the socioeconomic benefits associated with big data analytics.  Mark MacCarthy, Vice President for Public Policy of the Software & Information Industry Association, remarked that big data could be used to promote socioeconomic equality, citing, for example, the use of predictive analytics in education to identify students at risk of dropping out of school and to deploy early prevention tools.  Another compelling example is the use of cognitive computing in health care to assist with oncology diagnoses and accommodate the shortage of specialty providers.  Mallory Duncan, Senior Vice President and General Counsel of the National Retail Federation, addressed the positive uses of big data in the marketplace.  Big data provides retailers with a unique opportunity to develop trust and loyalty with their consumers.

The panelists also touted the benefits associated with the use of alternative data scores.  According to LexisNexis, 41% of Hispanics could not be scored by traditional methods for purposes of obtaining a credit score.  The new alternative data scoring allowed 81% of those unscorable consumers to be scored, for example, by using the frequency of consumers’ utility bill payments as a proxy to determine whether they would be a good loan risk.  In an interesting rebuttal, David Robinson of Robinson + Yu cautioned that repurposing consumer data to benefit marginal groups could come at a high social cost.  Hence, he posited that the benefits of alternative data would best be realized by treading very carefully when repurposing consumer data.

In line with this notion of social cost, Nicol Turner-Lee, Vice President and Chief Research & Policy Officer for the Minority Media and Telecommunications Council, noted that 30 million persons were not online and therefore were not contributing in this space.  Further, many consumers were unbanked or not “data-ready.”  Their exclusion resulted in segmented marketing: either those persons were disadvantaged because they could not share in the socioeconomic benefits of big data, or the results were skewed because factors like data literacy were not being considered.

B.        Current Legal Landscape
The panelists described big data as immature.  No firm legislative definition exists; no global solutions have been implemented.  Instead, the current legal and regulatory landscape for big data is a patchwork of sectoral anti-discrimination laws, for example, in the employment, fair lending, and fair housing contexts.  In a compelling query posed to one of the panels, the panelists disagreed as to whether the current patchwork legal and regulatory scheme was sufficient to protect consumers from harm.  However, they generally agreed the existing scheme provided useful insight into the regulation of big data, particularly with respect to the Equal Credit Opportunity Act (“ECOA”) and Title VII employment laws.

The panelists discussed the applicability of these sector-specific anti-discrimination laws to advertising that relies on big data.  For example, Leonard Chanin of Morrison & Foerster discussed the limited applicability of Regulation B, which implements the ECOA, to marketing activities during the pre-application stage of a credit transaction.  The panelists recommended that regulators conduct further research into whether the ECOA reaches the marketing of credit offers through the use of data analytics and whether such marketing would discourage consumers from obtaining credit information.

Carol Miaskoff, Assistant Legal Counsel for the Equal Employment Opportunity Commission, discussed the potential applicability of the anti-discrimination provisions of Title VII to the big data space, particularly with respect to recruitment and employment pre-screening practices.  Ms. Miaskoff emphasized one potentially critical distinction in the application of Title VII to big data: the mere existence of a disparate impact does not, by itself, render a practice illegal under Title VII.  Rather, Title VII tolerates a practice that produces a disparate impact and will render it illegal only if the practice does not accurately predict an applicant’s or employee’s success in the job.

C. Lee Peeler, President and CEO of the Advertising Self-Regulatory Council and Executive Vice President for National Advertising Self-Regulation at the Council of Better Business Bureaus, discussed the applicability of the FTC Act to big data.  He first distinguished between the practice of using data for decision-making and using data for targeted advertising.  The former practice potentially would implicate well-established prohibitions on the use of highly sensitive personal information, such as a consumer’s race or marital status.  The latter practice would fall within the purview of a deceptive and/or unfair practice under Section 5 of the FTC Act, 15 U.S.C. § 45.  Under the deception theory, entities using big data to narrowly target consumer groups for advertising purposes would be liable for the reasonable interpretation of that advertisement.  Further, under the deception theory, data brokers would be responsible for the accuracy of the representations made about their customer databases.  Under the unfairness theory, the analysis would consider whether the ad is targeted and whether the targeted group has access to alternative products.

These sector-specific anti-discrimination laws provide the FTC and other regulators with some enforcement teeth.  Panelists throughout the day expressed uncertainty about the need for new regulations to address the potential harms that may result from the use of big data.  Peter Swire, Professor of Law and Ethics at the Georgia Institute of Technology’s Scheller College of Business, remarked that these laws provide useful insight for the regulation of advertising and big data.  He posited that online marketing should not be exempt from the anti-discrimination principles embodied in the ECOA, Title VII, and numerous other laws.  In particular, fair lending laws offer decades of lessons from jurisprudence, regulatory guidance, and industry initiatives that could help guide next steps for both industry and regulators in valuing marketing data and ensuring advertising is conducted in ways that benefit consumers.

C.       Lack of Visibility
Panelists discussed the lack of visibility of big data and data analytics.  Kristin Amerling, Chief Investigative Counsel and Director of Oversight of the U.S. Senate Committee on Commerce, Science, and Transportation, articulated findings of a recent congressional investigation into nine major data brokers.  Among the findings, Ms. Amerling noted that data brokers were perpetuating the lack of transparency by contractually requiring that their customers, the companies buying and using the data, not disclose the source of the information.

In a more conceptual vein, David Robinson articulated the central tension surrounding big data: to the extent big data is being used to surface relationships that are not intuitively obvious but are useful in the marketplace, it has become very difficult to explain to consumers and regulators why certain factors end up in these models.  Currently, there is limited or no ability to easily describe how the decisions, categorizations, and classifications are made.

Compounding the lack of an “understandable” relationship between data and outcome, panelists suggested the lack of visibility may be due in large part to the myriad ethical concerns associated with the data.  Danah Boyd, Principal Researcher at Microsoft Research, noted current research being conducted at Microsoft using Bing data that could determine whether a person would be hospitalized within the next 48 hours.  In another study, by JP Morgan, analysts could predict with a high level of probability whether persons were trafficking humans for sex.  In those examples, did such companies have an obligation to notify consumers?  What would be considered an appropriate form of intervention, if any?  These are abstract, philosophical questions that underscore the socio-ethical issues surrounding the use of big data.

D.       Role of Education, Notice, and Consent
The panelists discussed the expectations of consumers in the big data space, in particular, the extent to which education, notice, and consent were effective in empowering consumer understanding and consumer choice.  This question about the role of education polarized the panelists.  A few panelists believed that notice and consent are not the most effective tools for the more marginalized groups, and that relying on consumers to know when to opt out of data collection and analytics would not be a productive way to protect them.  Rather, some panelists suggested the emphasis be on empowering advocacy groups with the transparency and tools necessary to help these marginalized consumer groups.  David Robinson described these advocacy groups as holding the “franchise” to the most low-income, minority groups and suggested that practices be made transparent enough to allow handholding by these advocates, which is a role the FTC has played well.
E.        Path Forward
The following general proposals were advanced throughout the day.
  • Define “consumer harm.”  The panelists agreed that more work needs to be done to define “harm,” including what constitutes inappropriate or unethical data use.  The panelists discussed whether harm should be universally defined or considered on a case-by-case, sector-specific basis.  To illustrate, the panelists noted that the harms caused by predatory lending would be different from the harms caused by targeted advertising.
  • Use existing legal and regulatory resources.  The panelists generally agreed that new legislation seeking to implement consumer protections from big data was not the correct answer.  Rather, many panelists advanced the position that localized solutions would be sufficient to protect consumers from harms associated with big data.  For example, Nicol Turner-Lee posited that a general framework like the Fair Information Practice Principles (“FIPPs”) would adequately prevent predatory behavior and possible civil rights infractions.  The question would then become, if one were to use the Fair Credit Reporting Act (“FCRA”) or FIPPs as the template, when would it be applied and how nuanced would the application be to the particular data set?

    With respect to the repurposing of existing frameworks, Pamela Dixon, Founder and Executive Director of the World Privacy Forum, advanced the application of the Common Rule.  This rule was built on the Belmont Report, which is in turn based on the Nuremberg Code, adopted to prevent human research atrocities.  The bedrock principle of the Common Rule is human consent, a principle that has resonated across the decades.  Ms. Dixon remarked that where violations of the Common Rule have occurred, society has viewed the resultant harm as categorically unfair.
  • Implement data-risk analysis as an industry best practice.  Many panelists observed that there is tremendous opportunity for cross-sector collaboration and for companies to realize greater trust with consumers.  However, the tools to execute those objectives have not yet been perfected, and companies need clarity on what risks they need to mitigate.  Many panelists recommended that companies implement a “data risk” framework similar to the notion of “privacy by design.”  By way of illustration, the data benefit analysis advanced by the Future of Privacy Forum comprises two elements.  First, organizations are asked to assess the “raw value” of a benefit, which consists of (a) the nature of the benefit, (b) the potential beneficiaries, and (c) the degree (or size and scope) of the benefit.  Second, organizations are asked to discount the raw value score by the probability that the benefit can be achieved, yielding a discounted “data benefit” value score (a simplified illustration of this calculation appears below).
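To make the two-step arithmetic concrete, the sketch below models a discounted data benefit calculation in Python.  The 1-to-5 component scale, the equal weighting of the three components, and the example figures are illustrative assumptions only; the Future of Privacy Forum framework described above supplies just the structure, a raw value score discounted by the probability that the benefit is achieved.

```python
# Illustrative sketch only: a "raw value" score built from the nature,
# beneficiaries, and degree of a benefit, discounted by the probability
# that the benefit is actually achieved. The 1-5 scale, equal weighting,
# and example numbers are assumptions for illustration, not part of the
# Future of Privacy Forum framework itself.

from dataclasses import dataclass


@dataclass
class DataBenefit:
    nature: int          # significance of the type of benefit (1-5)
    beneficiaries: int   # breadth of the class of beneficiaries (1-5)
    degree: int          # size and scope of the benefit (1-5)
    probability: float   # likelihood the benefit is achieved (0.0-1.0)

    def raw_value(self) -> float:
        """Step 1: combine the three components into a raw value score."""
        return (self.nature + self.beneficiaries + self.degree) / 3

    def discounted_value(self) -> float:
        """Step 2: discount the raw value by the probability of success."""
        return self.raw_value() * self.probability


if __name__ == "__main__":
    # Hypothetical example: a broadly beneficial project with a moderate
    # likelihood of success.
    benefit = DataBenefit(nature=4, beneficiaries=5, degree=3, probability=0.6)
    print(f"Raw value score:    {benefit.raw_value():.2f}")
    print(f"Discounted benefit: {benefit.discounted_value():.2f}")
```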
 

Conclusion 

In her closing remarks, Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, emphasized that big data would be an enforcement and regulatory priority.  Key industry players should expect heightened regulatory scrutiny where big data practices potentially implicate laws the FTC currently enforces, such as the FCRA and Section 5 of the FTC Act.  The FTC will continue to encourage self-regulation by businesses and industry to guard against bias and disparate impacts on marginalized populations in the design of their algorithms and predictive products.  Business and industry players should watch for more guidance from the FTC and other regulators in the near future, in the form of enforcement actions and regulatory supervision, in this uncertain and amorphous space.