FTC, YouTube, and Kids’ Privacy: Key Takeaways from the Biggest COPPA Settlement in FTC History

On September 4, 2019, the U.S. Federal Trade Commission (FTC) announced that Google and its subsidiary YouTube will pay a record $170 million to settle allegations that YouTube violated the Children’s Online Privacy Protection Act (COPPA). In many ways, this settlement is the next step in an approach to COPPA enforcement that the FTC began with the TikTok case, which we’ve covered extensively in the past.

One might be tempted to write off this case as simply “TikTok 2.0,” but the settlement presents a novel interpretation of COPPA liability for third-party intermediaries and offers important new takeaways for any company looking to reduce its risk. Essentially, the FTC appears to be moving beyond enforcement against traditional “child-directed” offerings, such as TinyCo and the Disney subsidiary Playdom, and toward enforcement against “general audience” services and third parties that do not themselves operate child-directed services but have actual knowledge that they receive personal information from child-directed business partners.

Background

YouTube is a well-known video streaming service owned by Google that is at least as popular with adults as it is with children. One report notes that 96 percent of 18- to 24-year-old internet users in the United States use YouTube, and that YouTube reaches more 18- to 34-year-olds in the U.S. than any TV network. Nevertheless, YouTube is clearly popular with kids as well; the FTC complaint alleges that “YouTube was the #1 website regularly visited by kids” and “the #1 source where children discover new toys + games.”

The FTC settlement requires Google and YouTube to pay $136 million to the FTC and $34 million to the New York State Attorney General, which had launched its own investigation. The $136 million penalty is by far the largest the FTC has obtained in a COPPA case since Congress enacted the law in 1998. As part of the settlement, Google and YouTube will implement a system to identify child-directed content on YouTube, notify channel owners that their child-directed content may be subject to COPPA’s obligations, and provide annual COPPA compliance training for employees who deal with YouTube channel owners.

Liability Turns on “Actual Knowledge” of Users’ Content

Importantly, the FTC complaint did not allege that YouTube as a whole is directed to children, or that Google itself marketed the YouTube service to children. Rather, the complaint premised Google’s liability on the child-directed nature of content uploaded by YouTube’s users.

Under the COPPA Rule, 16 C.F.R. § 312.2, a third-party “Web site or online service shall be deemed directed to children when it has actual knowledge that it is collecting personal information directly from users of another Web site or online service directed to children.” (Emphasis added.) In other words, while a provider of child-directed content is certainly subject to COPPA, a service provider can also be liable if it knowingly collects personal data through someone else’s child-directed site or service.

The key factor here is the provider’s actual knowledge. A provider that receives personal information from a child-directed service is not automatically responsible for treating that information in compliance with COPPA, nor is it obligated in all circumstances to inquire whether its business partners’ services are directed to children. Nevertheless, this case shows that the FTC may allege, based on a provider’s conduct, that it must have had actual knowledge that it was collecting information from child-directed sites or services.

In this case, the FTC alleged that Google and YouTube violated COPPA by collecting personal information from, and targeting behavioral advertising to, viewers of channels they knew to be child-directed, without first obtaining verifiable parental consent. According to the FTC, Google and YouTube knew which channels were child-directed because of YouTube’s own content rating system (videos rated “Y” were “generally intended for ages 0-7”); the branding of these channels (e.g., content related to well-established kids’ properties like Barbie and Cartoon Network); communications between channel owners and YouTube representatives; representations made by the channel owners in the channels’ “About” sections; and Google’s decision to highlight the channels as part of the separate “YouTube Kids” app (described in more detail below).

Because Google and YouTube knew they worked with child-directed channels, the FTC reasoned, they should have either disabled behavioral advertising on those channels or obtained verifiable parental consent before collecting personal information from their viewers.

Key Takeaways for Platforms, Ad Networks and Content Providers

The FTC’s settlement with Google and YouTube offers several important lessons to platforms, ad networks and content providers.

Takeaway 1 (for all companies): Your sales and marketing staff’s statements may be used against you. The FTC alleged that YouTube’s marketing team bragged about the platform’s popularity with kids, yet told a business partner that YouTube did not need to comply with COPPA because it is a general-audience site.

Best Practice: In any company, communication is critical. The legal department should know how the marketing team is selling the company’s services and should ensure that its statements are accurate and legally approved. Sales and marketing staff should receive periodic training on heightened areas of legal risk that touch their business.



Takeaway 2 (for platforms and ad networks): User-generated content sites can be subject to, and liable under, COPPA if you have “actual knowledge” that your users upload child-directed content to your service.
This case has implications for any company that allows users to upload user-generated content (UGC) to its platform, as well as for any advertising or analytics company that collects personal data from its partners’ apps.

Unlike Section 230 of the Communications Decency Act, which immunizes online service providers from being treated as publishers of their users’ content regardless of the providers’ knowledge, COPPA offers no such blanket protection: this action shows that the FTC will seek to hold a platform responsible for complying with COPPA if it knows its users’ content is directed to children. If a general-audience service provider learns that a user or third-party partner is providing child-directed content or services, such as apps, forums or chat rooms, it must either disable its collection of personal data from that content or obtain verifiable parental consent before collecting it.

Best Practice: COPPA does not currently require platform providers or ad networks to affirmatively monitor their users’ or business partners’ content to determine whether it is child-directed. Nevertheless, to reduce their risk, many ad networks as a best practice already require their app developer partners to self-certify upfront whether their apps are child-directed, as illustrated in the sketch below.
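To make this concrete, here is a minimal sketch of what that self-certification can look like on the developer side, using the child-directed treatment flag that Google’s Mobile Ads (AdMob) SDK for Android exposes for this purpose; other ad networks offer analogous declarations, and the surrounding app setup is omitted here.

```java
import com.google.android.gms.ads.AdRequest;

public class ChildDirectedAdRequest {
    // Builds an ad request for content the developer has self-certified as
    // child-directed. The flag tells the ad network to apply COPPA-appropriate
    // treatment, such as disabling behavioral (interest-based) targeting.
    public static AdRequest buildChildDirectedRequest() {
        return new AdRequest.Builder()
                .tagForChildDirectedTreatment(true)
                .build();
    }
}
```

Because the flag travels with each ad request, the network receives the developer’s certification at the moment personal data would be collected, which is precisely the point at which COPPA’s restrictions attach.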



Takeaway 3 (for platforms and ad networks): Having a “child zone” isn’t enough, and in fact, could even put the “adult zone” content at greater risk.
YouTube was unusual in that it offered a separate “YouTube Kids” app, designed to provide a COPPA-compliant experience for children. However, this compliance plan arguably backfired: the same video content that YouTube hosted, curated and promoted for YouTube Kids could also be found on the main YouTube site, with behavioral advertising enabled. The FTC seized on YouTube’s selection of content for YouTube Kids as additional evidence that it knew the same content on the general YouTube site was child-directed.

Best Practice: It’s not an inherently bad idea to section off a portion of your service as “for kids” and treat the kids’ content separately from a data perspective; that segregation is typically an effective compliance strategy. However, this case shows that segregation alone is not enough, especially for UGC services. If content from the kids’ section is also packaged into collections on the general-audience section of a service, the platform operator should treat those collections as subject to COPPA, and either disable behavioral advertising or require verifiable parental consent as needed, along the lines of the sketch below.
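As a rough illustration, the hypothetical sketch below propagates a kids’-section designation to the general-audience side of a service and falls back to contextual advertising absent verifiable parental consent. Every type and method name here is invented for illustration; none of this is YouTube’s or any platform’s actual API.

```java
import java.util.Set;

// Hypothetical sketch only: all names below are invented for illustration.
public class KidsContentGate {

    enum AdDecision { CONTEXTUAL_ONLY, PERSONALIZED_ALLOWED }

    // A video is treated as child-directed if the uploader declared it so, or
    // if the platform curated it into its kids' section. The FTC treated
    // curation for YouTube Kids as evidence that the operator knew the same
    // content was child-directed on the main site.
    static boolean isChildDirected(String videoId,
                                   boolean uploaderDeclaredChildDirected,
                                   Set<String> kidsSectionCatalog) {
        return uploaderDeclaredChildDirected || kidsSectionCatalog.contains(videoId);
    }

    // Child-directed content gets contextual ads only, unless verifiable
    // parental consent has been obtained for this viewer.
    static AdDecision chooseAds(String videoId,
                                boolean uploaderDeclaredChildDirected,
                                Set<String> kidsSectionCatalog,
                                boolean hasVerifiableParentalConsent) {
        if (isChildDirected(videoId, uploaderDeclaredChildDirected, kidsSectionCatalog)
                && !hasVerifiableParentalConsent) {
            return AdDecision.CONTEXTUAL_ONLY; // no behavioral profiling
        }
        return AdDecision.PERSONALIZED_ALLOWED;
    }
}
```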



Takeaway 4 (for content providers): Expect increasing pressure from platform and advertising partners to classify your content as either child-directed or general audience.
Following this settlement, YouTube reportedly plans to use machine-learning algorithms to detect child-directed video content. As a result, content providers on YouTube and other platforms may be concerned about lower revenues from child-directed content due to increased restrictions on behavioral advertising.

Best Practice: Content creators will have to adapt to platforms’ machine-learning classifiers to avoid having their content misclassified. That said, many companies in kids’ media have been surprised to learn that revenues from contextual advertising are not substantially lower than revenues from behavioral advertising, perhaps because kids are not as influenced by personalized ad experiences as adults are.



These are complicated issues, and the law may change again soon: the FTC is considering amendments to the COPPA Rule and has scheduled a public workshop on the Rule for this October. As always, companies should work closely with privacy counsel during this period of heightened focus on children’s privacy, and take necessary precautions now.
