It May Be Time for TikTok to Change Its Ways if State AGs Have Any Say
While State Attorneys General have made clear that social media companies are generally on their radar for a variety of consumer protection concerns, TikTok is the latest to make headlines in recent weeks. For example, multiple states have banned TikTok from government phones, and a federal government ban may soon follow, because of concerns about the Chinese government’s control over the platform. Higher education institutions are also banning the app, with Georgia public colleges and the University of Oklahoma among the most recent to do so.
Earlier this month, Indiana Attorney General Todd Rokita filed two complaints against TikTok based on two very different theories, one of which involves a foreign component. Both cases allege violations of the state’s Deceptive Consumer Sales Act.
Representations About Audience-Appropriateness
In its first complaint, the state alleges that TikTok makes a variety of misleading representations and omissions to claim a “12+” rating in the Apple App Store and a “T” for “Teen” rating in the Google Play and Microsoft Stores. Instead, the state asserts, TikTok should self-report ratings of “17+” (in the App Store) and “M” for “Mature” (in the Google Play and Microsoft Stores). To support these allegations, the state details the alcohol, tobacco, and drug content, sexual content, nudity, mature/suggestive themes, and profanity on the TikTok platform, which it claims are far more frequent and severe than TikTok’s self-reported ratings suggest. The state’s filings also include an affidavit by outside counsel detailing the mature content she recorded while posing as a teen user and the view counts of specific videos.
In addition, the complaint takes aim at TikTok’s “Restricted Mode,” a feature parents can use to limit inappropriate content. The state alleges that much of the mature content described in its complaint remained accessible in Restricted Mode. Moreover, the complaint alleges that TikTok actively suggests mature content through its “Autocomplete” search feature and by surfacing such content on a user’s personalized “For You” page.
Privacy and Data Security Representations
Indiana’s second complaint alleges that TikTok misleads consumers about the risks the app poses to their data. Per the complaint, these risks stem from TikTok and its algorithm being owned by ByteDance Ltd., a Chinese company subject to Chinese law, which, according to the state, mandates secret cooperation with China’s intelligence activities.
The state seeks injunctive relief as well as civil penalties in both complaints.
In a related matter, fifteen state AGs, led by Montana Attorney General Austin Knudsen, sent letters to Apple and Google demanding that the companies change their app store age ratings for TikTok to “17+” and “Mature,” respectively, by the end of the year. Notably, the letters also state, “Some of us are already pursuing or considering legal action against TikTok…” These actions follow publicly announced investigations of TikTok over its impact on the physical and mental health of its young users.
Practical Takeaways
While the outcome of Indiana’s actions remains to be seen, it is clear that children’s safety and data protection, particularly in connection with social media, continue to be hot topics for state AGs and other regulators such as the FTC. In light of this scrutiny, app developers and owners may want to consider the following questions:
- Does your app’s rating accurately reflect the known maturity level of the app’s content? Is your user base consistent with the content level?
- If your app or content platform allows self-reporting of content age levels, what framework is in place to audit or confirm the accuracy of those reports, and what steps do you take when you identify children using your app? This may be important both for general consumer protection compliance and for compliance with privacy laws such as the Children’s Online Privacy Protection Act, an issue also raised by the FTC’s recent action against Epic Games.
- If your company is touting parental controls, enhancements made for children’s safety, or privacy protections generally, are those protections working, and what are you doing to verify this? AGs may be looking closely at whether companies are delivering on these types of claims.