Ad Law Access
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access
Updates on advertising law and privacy law trends, issues, and developments

Blurred Lines: A Rundown on the FTC Workshop “Protecting Kids from Stealth Advertising in Digital Media”
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/blurred-lines-a-rundown-on-the-ftc-workshop-protecting-kids-from-stealth-advertising-in-digital-media
Wed, 26 Oct 2022

As we recently blogged here, the FTC’s review of the COPPA rule has been pending for over three years, prompting one group of Senators, in early October, to ask the agency to “Please Update the COPPA Rule Now.” The FTC has not yet responded to that request (at least not publicly) or made any official moves towards resuming its COPPA review. However, the agency is focusing on children’s privacy and safety in other ways, including by hosting a virtual event on October 19 on “Protecting Kids from Stealth Advertising in Digital Media.”

The FTC’s day-long event examined how advertising that is “blurred” with other content online (“stealth advertising”) affects children. Among other things, the event addressed concerns that some advertising in the digital space – such as the use of influencers on social media, product placement in the metaverse, or “advergames” – can be deceptive or unfair because children don’t know that the content is an ad and/or can’t recognize the ad’s impact.

The event focused in particular on: (1) children’s capacity at different ages to recognize advertising content and distinguish it from other content; (2) harms resulting from the inability of children to recognize advertising; (3) what measures can be taken to protect children from blurred advertising content; and (4) the need for, and efficacy of, disclosures as a solution for children of different ages, including the format, timing, placement, wording, and frequency of disclosures. The FTC has also sought public comment on these topics (until November 18).

The event dove deeply into these issues, with help from a range of legal, policy, behavioral, and communications experts. (See here for the agenda and list of panelists.) The discussion was interesting and substantive, and built on actions already undertaken in Europe and California to develop Age-Appropriate Codes governing child-directed content. However, the event left open the question of whether and how the FTC intends to address the issues discussed. Will it proceed via guidance or rulemaking? If rulemaking, does it plan to use COPPA, the pending Magnuson-Moss (“Mag-Moss”) rulemaking on “commercial surveillance,” or some other regulatory vehicle?

All of these options present challenges: COPPA gives parents the tools to control the content that their children see, but generally doesn’t regulate the content itself. Mag-Moss is a long process, which the FTC has made especially complex with its sprawling ANPR. Finally, any rulemaking restricting kids’ advertising could run into the specific Mag-Moss provision (discussed here) limiting the FTC’s regulatory authority in this area. (On the other hand, protecting kids’ privacy and safety tends to be a bipartisan issue, which will assist the agency as it seeks to address these issues.)

Here’s more detail on what happened at the workshop:

First up, Opening Remarks from FTC Chair Lina Khan

In her opening remarks, Chair Khan set the stage by describing how much advertising has changed over the past few decades. In the past, every child would see the same ad, but now, the digital space allows companies to treat each child as an audience of one. Also, ads now blur commercial and organic content, and kids can’t tell the difference. Chair Khan stated that the FTC is exploring whether to update its COPPA rule, while also soliciting comments on commercial surveillance more broadly, including stealth advertising to kids.

Next, CARU VP Mamie Kresses provided a “Children’s Advertising Show and Tell”

Kresses (who co-ran the COPPA program when she was at the FTC) explained that CARU (the Children’s Advertising Review Unit at BBB National Programs) has increased its focus on monitoring ads to children in the digital space because that’s where the majority of ads now are. Advertisers need to make sure that, when they engage in blurring, they don’t mislead kids about the nature of the commercial content. Advertisers also should avoid manipulation – i.e., making it hard for children to tell when they’re making purchases, or playing too heavily on their emotions.

Importantly, Kresses said, the digital space has changed with the creation of computer-generated imagery influencers. Advertisers should make clear to kids that a game avatar, for example, is part of a paid relationship. In general, advertisers must clearly disclose whether something is an ad, even in these new, creative spaces.

Panel One: Children’s Cognitive Abilities – What do they know and when?

In this discussion, panelists highlighted why protecting children in the digital space is so important. Children lack the skills to understand the persuasive effects of advertising and also tend to believe that companies have their best interests in mind. When entertainment and commercial content are blurred (e.g., when a virtual reality character gives a child something in the metaverse, or an influencer promotes a product), the child cannot tell that these are ads and assumes that the content is all good or true. In these spaces, children develop para-social relationships and emotional attachments with content creators and influencers, which affect their ability to evaluate ads and cues. As one panelist stated, the naiveté of children should not be a tool for advertisers.

Panel Two: The Current Advertising Landscape and its Impact on Kids

This panel primarily discussed whether stealth advertising is an unfair practice under the FTC Act. Applying the elements of unfairness, some panelists thought that the harms outweighed the benefits, while others believed the research was not strong enough to prove harm and that any harm is outweighed by the value of the information conveyed by the ads.

According to some panelists, blurred advertising can distract a child from the persuasive intent of an ad, causing them to rely more on emotion and less on rationality. Also, they said, research suggests that some methods of blurred advertising, such as the use of influencers, can be toxic to children, increasing eating disorders and adding to the current mental health crisis, especially when the advertising is targeted and prolonged. Other panelists argued that just because an advertisement works does not mean it’s harmful or deceptive. They also said that non-deceptive ads are protected under the First Amendment.

Panel Three: Looking Forward and Considering Solutions

The last panel discussed potential solutions to the challenge of stealth advertising, including disclosures, parental controls, educational programs, or even a ban on blurred advertising directed at children. As these panelists recognized, the solutions all come with limitations: (1) children cannot always read or understand disclosures; (2) parents don’t have the time or resources to monitor every piece of content their child consumes; (3) there’s a lack of resources for educational programs; and (4) a ban could face First Amendment issues.

Closing Remarks from FTC Associate Director Serena Viswanathan

In closing, Viswanathan stated that the FTC hopes to provide guidance and recommendations regarding how to comply with applicable laws and avoid problems associated with stealth advertising to kids. She said the FTC is following these issues with interest, eager to review the public comments, and continuing to engage with stakeholders.

* * *

That’s our quick summary for now. Stay tuned as we continue to track this topic and learn about the next steps the FTC may be planning in this area.

The next Ad Law News and Views newsletter is almost here to help you stay current on advertising, marketing, and privacy law matters. Sign up here so you don't miss it.

Webinar Replay: Teen Privacy Law Update
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/webinar-replay-teen-privacy-law-update
Fri, 20 May 2022

The replay for our May 19, 2022 Teen Privacy Law Update webinar is available here.

Protecting the privacy and safety of kids and teens online is receiving enormous attention lately from Congress, the States, the FTC, and even the White House. Further, just last month, BBB National Programs unveiled a Teenage Privacy Program Roadmap offering a comprehensive framework for companies to use in identifying and avoiding online harms impacting teens.

Amidst these developments, Kelley Drye held a webinar to discuss the unique challenges associated with teen privacy. Dona J. Fraser, Senior Vice President, Privacy Initiatives, BBB National Programs, and Claire Quinn, Chief Privacy Officer, PRIVO, along with Kelley Drye’s Laura Riposo VanDruff, provided an update on key concerns and developments related to teen privacy, as well as practical tips for companies seeking to address these issues.

To view the webinar recording, click here or view it on the new Ad Law Access App.

Subscribe to the Ad Law Access blog to receive real-time updates on privacy and other related matters.

The Ad Law News and Views newsletter provides information on our upcoming events and a summary of recent blog posts and other publications.

Visit the Advertising and Privacy Law Resource Center for additional information, past webinars, and educational materials.

For easy access to all of our webinars, posts and podcasts, download our new Ad Law Access App.

Kelley Drye Unveils First-of-its-kind Advertising Law App
New Federal Bill to Protect Kids’ Privacy: Will This One Break Through?
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/new-federal-bill-to-protect-kids-privacy-will-this-one-break-through
Tue, 22 Feb 2022

Last October, we blogged that bipartisan momentum was building in Congress to enact stronger privacy protections for children, even if (and especially if) Congress remains stalled on broader federal privacy legislation. Of particular significance, we noted a strong push to protect, not just kids under 13 (the cutoff under COPPA), but also teens.

Since then, the momentum to enact stronger privacy protections for kids and teens has only increased, fueled by charges that social media and algorithms are causing self-harm and addictive behaviors among minors; multiple rounds of testimony from a former social media insider; and the desire in Congress to find common ground on some aspect of consumer privacy. Several kid/teen bills have been proposed in just the last couple of months. (See for example here and here.)

The latest of these bills, introduced last week by Senators Blumenthal and Blackburn, has drawn a lot of attention – both because it’s bipartisan, and because these two Senators lead a key Senate subcommittee and held multiple hearings on algorithmic harms to teens. The bill (the Kids Online Safety Act or “KOSA”) has been endorsed by a number of organizations that focus on protecting kids’ safety and mental health. It also has drawn praise from Senator Cantwell, Chair of the Senate Commerce Committee, who told at least one media outlet that she is considering a committee markup on the bill.

KOSA’s stated purpose is to “require social media platforms to put the interests of children first” by establishing a “duty of care” to prevent harms to minors, “mak[ing] safety the default,” and enabling kids and parents “to help prevent the harmful effects of social media.” In announcing the bill, Blumenthal stated that it “would finally give kids and their parents the tools and safeguards they need to protect against toxic content—and hold Big Tech accountable for deeply dangerous algorithms.” Portions of the bill appear to be modeled after the UK’s Age Appropriate Design Code, a law that establishes content standards for minors, but is styled more like a guide setting forth principles and best practices. Here’s our summary of the bill’s key features:

  • It covers a wide range of entities. Although the press release and bill summary focus on social media platforms, the bill would extend to any “covered platform,” defined as “a commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” This definition would reach a huge range of Internet-connected devices and online services. It also leaves open the question of what it means to be “reasonably likely to be used” by a minor. (Some of the bill’s provisions are triggered when a platform “reasonably believes” a user is a minor – a phrase that raises similar questions.)
  • It extends protections to any minor 16 or under. This contrasts with the under-13 cutoff in COPPA, the primary U.S. federal law protecting kids’ privacy. It’s not clear how this bill would interact with COPPA.
  • A covered platform has a duty of care to minors. It must act in the “best interests” of minors, including by preventing and mitigating “heightened risks of physical, emotional, developmental, or material harms” posed by materials on, or engagement with, the platform. Examples of such harm include: (1) self-harm, eating disorders, or other physical or mental health risks; (2) patterns of use indicating or encouraging addictive behaviors; (3) physical harm, online bullying, or harassment; (4) sexual exploitation; (5) promoting products that are illegal to minors; and (6) predatory, unfair, or deceptive marketing practices.
  • The platform must provide tools allowing minors or their parents to control the minor’s experience. These include “readily-accessible and easy-to-use” settings that can: (1) limit the ability of strangers to contact the minor; (2) prevent third-party or public access to a minor’s data; (3) limit features that would increase, sustain, or extend a minor’s use of the covered platform (e.g., automatically playing media); (4) permit opting out of algorithmic recommendations; (5) delete the minor’s account and personal data; (6) restrict sharing a minor’s geolocation information; and (7) limit time spent on the platform. The defaults for these settings must be the “strongest option[s] available” and the platform can’t use features that would encourage minors to weaken or turn off the safeguards. The bill does not specify whose choice would control if the parent and child both try to change the same settings.
  • The platform must enable parental controls by default for any user it reasonably believes to be a minor. These include tools allowing parents to: (1) control the minor’s privacy settings; (2) restrict purchases; (3) track the minor’s time on the platform; (4) change the default settings; and (5) control options necessary to prevent the harms described above. The platform also must provide clear and conspicuous notice to the minor when parental controls are on, as well as a mechanism for a parent to submit reports of harm to a minor.
  • The platform must provide detailed disclosures about its safeguards, risks, algorithms, and advertising. As part of these requirements, the platform must obtain the minor’s or parent’s acknowledgement of the risks before the minor can use the platform; label and explain any advertising (including targeted advertising) aimed at minors; and allow minors or their parents to “modify the results of the algorithmic recommendation system” (as well as opt-out, as noted above).
  • Each year, the platform must obtain a third-party audit of the risks posed to minors and issue a public report. In addition to identifying the risks, the audit must address (1) what efforts the platform has taken to prevent or mitigate them; (2) how algorithms and targeted ads can harm minors; (3) how the platform collects and uses sensitive data, including geolocation, contacts, and health data; and (4) who is using the platform and for how long, by age ranges.
  • The bill gives the FTC APA rulemaking and civil penalty authority, and authorizes AG enforcement. Other provisions (1) give independent researchers access to the platform’s datasets; (2) direct the FTC and the Department of Commerce to establish guidelines for market or product research; (3) require a multi-agency study on age verification options; and (4) establish a Kids Online Safety Council to advise on the Act’s implementation.

Will this be the bill that breaks the federal privacy law stalemate and makes it into law? We suppose it’s possible. This bill is bipartisan, and Chair Cantwell is dangling the possibility of a markup – a rare event (at least lately) for a federal privacy bill. On the other hand, we’re already in an election year and Congress has a lot of other matters on its plate. Further, the extraordinary reach of the bill, coupled with its lack of clarity on a number of issues, suggests that many changes would be needed before this bill could become law.

Still, regardless of the outcome of this particular bill, it confirms what we predicted in October – that Congress has its sights on kids’ privacy, and that “kids” now includes teens 16 and under. Stay tuned.

Privacy Priorities for 2022

Please join us on Thursday, February 24 at 4:00 pm EST for Privacy Priorities for 2022, the second installment of Kelley Drye's 2022 practical privacy series. Register here.

App Developers Settle COPPA Violations Relating to Third-Party Ad Network Practices
https://www.kelleydrye.com/viewpoints/blogs/ad-law-access/app-developers-settle-coppa-violations-relating-to-third-party-ad-network-practices
Fri, 18 Dec 2015

This week, the FTC announced settlements with two mobile app developers – LAI Systems, LLC and Retro Dreamer (including two of its principals) – concerning allegations that their apps collected children’s personal information without obtaining parental consent, in violation of COPPA. These cases are the first in which the FTC has held a company liable for COPPA violations relating to the information collection practices of a third-party ad network.

In separate complaints, the FTC alleges that LAI and Retro Dreamer created a number of apps directed to children. The FTC’s determination that the apps were kid-oriented was based on a number of factors, such as the subject matter, visual content, language, and use of animated characters or child-oriented activities and incentives. In both complaints, the FTC alleges that the defendants permitted third-party advertising networks to collect children’s PII, in the form of persistent identifiers, through the apps in order to serve targeted advertising on the app based on users’ activity over time and across sites. (The FTC added persistent identifiers to the COPPA Rule’s definition of “personal information” when it updated the rule in 2013.) The complaints, however, do not identify the specific persistent identifiers used.

The FTC alleges that both LAI and Retro Dreamer failed to: (1) inform the ad networks that the apps were directed to children; (2) instruct or contractually require the ad networks to refrain from serving targeted ads; or (3) provide the required notices or obtain the required parental consent. In the case of Retro Dreamer, the FTC also alleges that one of its advertising networks specifically warned the company about the obligations of the revised COPPA Rule, and also told the company that certain of its apps appeared to be targeted to children under the age of 13. The settlements prohibit the companies from further violations of the COPPA Rule. The settlement with LAI requires the company to pay a $60,000 civil penalty, while the settlement with Retro Dreamer requires it to pay a $300,000 civil penalty.

The settlements highlight that the FTC remains vigilant in this area. The agency will likely continue to closely monitor the information collection practices of website operators and app developers, in addition to third-party ad networks.
