The United Kingdom’s Information Commissioner’s Office (ICO) finalized its Age Appropriate Design Code of Practice (the Code) in September 2020. The Code applies to most companies that offer online services to, or otherwise collect personal data from, users in the UK, if their service is likely to be accessed by children. With the Code now final, there is no better time to resolve to stay current on children’s privacy laws.
By publishing the Code, the ICO is signaling how it intends to interpret the data protection principles under the EU General Data Protection Regulation (GDPR) and the UK’s Privacy and Electronic Communications Regulations (PECR). As a result, violations of the Code could lead to GDPR-level fines: up to €20 million/£17.5 million or 4% of a company’s annual worldwide turnover, whichever is higher.
Fortunately, the Code includes a 12-month transition period, so companies should start learning about the new requirements and taking steps to comply now, before enforcement begins in September 2021.
The Code consists of 15 standards to which all covered online services will need to adhere, applied through a holistic, risk-based approach that considers the child’s best interests. While some of the standards, such as data minimization, transparency and restrictions on data sharing, will be familiar to companies with developed privacy programs, even companies accustomed to handling children’s privacy issues may be surprised by some of the Code’s new requirements. Some of the most impactful changes are outlined below.
Most global privacy laws set a minimum age (13 in the US, and 16 in many EU countries) under which no personal data may be collected from a child unless a parent gives consent or an established exception applies. The Code does not change the minimum age for data processing in the UK (it remains 13 years old), but separately asserts that children require special protections online until they are 18 years old. The Code requires companies to treat children differently depending on their age group (for example, 0-5, 6-9, 10-12, 13-15 and 16-17) by, among other things, tailoring privacy policy language to the reading comprehension level of each group.
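For illustration only, the sketch below shows one way a service might map a user’s age to the Code’s suggested age bands and select a matching privacy-notice variant. The Code is technology-neutral and does not prescribe any implementation; the function, type and variant names here are hypothetical.

```typescript
// Purely illustrative: the Code does not prescribe any technical implementation.
// The bands below follow the Code's suggested developmental stages.
type AgeBand = "0-5" | "6-9" | "10-12" | "13-15" | "16-17" | "adult";

function ageBand(age: number): AgeBand {
  if (age <= 5) return "0-5";
  if (age <= 9) return "6-9";
  if (age <= 12) return "10-12";
  if (age <= 15) return "13-15";
  if (age <= 17) return "16-17";
  return "adult";
}

// Hypothetical mapping from each band to a privacy-notice variant written
// for that band's reading comprehension level.
const noticeVariant: Record<AgeBand, string> = {
  "0-5": "notice-preliterate",   // icons, audio and parent-facing text
  "6-9": "notice-early-reader",  // cartoons and short sentences
  "10-12": "notice-preteen",
  "13-15": "notice-teen",
  "16-17": "notice-older-teen",
  "adult": "notice-standard",
};
```

In practice, the banding would drive not only notice language but also which defaults and design choices apply to a given user.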
Some of the ICO’s requirements, such as the requirements to limit data sharing and personalization efforts by default, are likely to have an impact on many companies, especially services with large numbers of UK users aged 16 and 17, who will now need additional protections under the new guidance.
The U.S. Children’s Online Privacy Protection Act (COPPA) is one of the most well-known (and enforced) children’s privacy laws in the world. Notably, COPPA does not require parental consent where a child’s data is used only for optimization, personalization, statistical reporting and other functions necessary to analyze and support the “internal operations” of the service. The ICO takes a much stricter view, holding that “collection of personal data in order to ‘improve,’ ‘enhance,’ or ‘personalize’ your users’ online experience [is] beyond the provision of your core service” and requires separate consent. Combine this strict interpretation with the ICO’s position that consent for non-essential processing must be provided by a parent where the UK child is under 13, and this provision is likely to have a major business impact on the practical ability of child-directed apps to perform analytics or otherwise use UK users’ data to improve their services.
That said, it is unclear whether the Code would necessarily prohibit the collection of de-identified data for analytics purposes, or the extent to which a company can define its personalization efforts as part of its “core service.” With this in mind, we recommend exercising caution before defining a “core service” too broadly: the ICO writes “although there may be some limited examples of services where behavioral advertising is part of the core service (e.g. a voucher or ‘money off’ service), we think these will be exceptional. In most cases the funding model will be distinct from the core service and so should be subject to a privacy setting that is ‘off’ by default.”
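To make the “off by default” expectation concrete, here is a minimal sketch assuming a service that tracks per-user privacy settings. The interface and field names are hypothetical; the point is simply that non-essential processing stays off until affirmatively enabled with valid consent.

```typescript
// Purely illustrative "high privacy by default" settings for a user who is,
// or is likely to be, a child; the interface and field names are hypothetical.
interface ChildPrivacySettings {
  personalization: boolean;       // "improve/enhance/personalize" processing
  behavioralAdvertising: boolean; // distinct from the core service, per the ICO
  thirdPartySharing: boolean;
  geolocation: boolean;
}

// Non-essential processing stays off until switched on with valid consent
// (parental consent where the UK user is under 13).
const defaultChildSettings: ChildPrivacySettings = {
  personalization: false,
  behavioralAdvertising: false,
  thirdPartySharing: false,
  geolocation: false,
};
```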
One of the Code’s standards requires companies to consider the “best interests of the child” when making design decisions around data; this concept is referenced frequently throughout the Code’s other standards as well. According to the Code, determining a child’s best interests is a holistic analysis incorporating a variety of factors, such as the child’s rights to freedom of expression, thought, conscience, religion and association; access to information; age-appropriate play; protection against economic, sexual or other forms of exploitation; and more.
Children’s privacy laws and enforcement have traditionally overlapped with other child-protection initiatives. For example, the Italian Data Protection Authority recently placed a temporary ban on TikTok following the death of a 10-year-old girl from Palermo, who suffocated after participating in a ‘choking challenge’ on the social media platform. (This is in addition to the Indian government’s ban on TikTok for activities deemed “prejudicial to sovereignty and integrity of India,” suggesting a high level of international mistrust of these social media services.)
Nevertheless, the Code makes clear that the ICO intends to use the GDPR and PECR (and their associated penalties) as the enforcement hook for just about any “detrimental impact” on a child that can be traced back to the child’s data, and the ICO has indicated that a range of such activities would be violations enforceable under the Code.
Although many companies provide robust parental controls designed to allow parents to monitor and control their children’s use of a given service, the Code recognizes that children have certain privacy rights even against their parents. For example, the Code requires that children be notified whenever a parent is monitoring their activity. This requirement may not be intuitive to many companies, especially those in cultures with a more expansive view of a parent’s right to monitor and control a child’s online behavior. It also creates another potential conflict with COPPA, which mandates that operators of services directed to children under 13 give parents the unqualified ability to access and delete the personal information that a service collects from their children.
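As a purely illustrative sketch of the notification requirement, a service might surface an obvious, age-appropriate sign to the child whenever monitoring is active. All names below are hypothetical, and the Code does not mandate any particular UI.

```typescript
// Purely illustrative: show the child an obvious sign whenever parental
// monitoring is active. The interface and function names are hypothetical.
interface MonitoringState {
  parentMonitoringEnabled: boolean;
}

function childFacingBanner(state: MonitoringState): string | null {
  // The Code expects the child to be told when they are being monitored,
  // not a disclosure buried in a settings page.
  return state.parentMonitoringEnabled
    ? "A parent or guardian can currently see your activity on this service."
    : null;
}
```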
For companies that already take steps to comply with children’s privacy laws like COPPA and GDPR-K, below are some items to focus on in 2021:
Work toward kid-friendly privacy disclosures in your privacy policy and “just-in-time” notices. Companies offering online services that appeal to children should consider adding a “kid-friendly” version of their privacy policy to supplement their more robust legal disclosures. The Code also recommends diagrams, cartoons and/or graphics to explain privacy concepts to children between the ages of 6 and 12.
While the Code presents a number of new challenges, now is the best time for companies to think through their compliance programs and implement changes, always keeping in mind a smart, risk-based approach. And there will be even more to consider on the horizon: the Irish Data Protection Commission has announced its own code, which remains in draft until March 2021. As always, Fenwick’s attorneys are here to help you navigate these constantly shifting standards.