ICO’s ‘Children’s Code’ applies from today – what you need to know

  • United Kingdom
  • Global
  • Privacy, data protection and cybersecurity

02-09-2021


What do you need to know?

The Information Commissioner’s Office (“ICO”) ‘Age Appropriate Design Code’ applies from 2 September 2021.

The code comprises a set of standards for designing and providing online services to ensure that they safeguard the personal data of children (being any individual under the age of 18) and requires certain information society service (“ISS”) providers to comply with these standards.

The code came into force on 2 September 2020, with a 12 month transition period to give organisations time to prepare. The ICO set up a Children’s Code Hub to provide organisations with advice and resources to help them achieve compliance.

You should have already considered whether the code will apply to the online services you are providing, taking into account the nature of your service and the way in which the service is accessed, including any age restrictions. If your online service, or any element of it, may appeal to (or be likely to be utilised by) children, you should have built in high privacy settings by default to ensure child users are protected, including limiting the personal data collected to the minimum needed.

As an alternative, you may have chosen to apply the code’s standards to all users, so that all users receive enhanced basic protection in respect of the use of their personal data.

In more detail…

Background

The aim of the code is to provide practical guidance on building data protection safeguards into online services and ensuring they are appropriate for use by children.

The ICO was required to prepare the code before 23 November 2019 under section 123 of the Data Protection Act 2018 (“DPA”). Following a call for evidence which ran between June and December 2018, the ICO released the draft code for consultation in April 2019. 97 responses were received from stakeholders including children’s charities, social media sites and broadcasters.

On 22 November 2019, the ICO confirmed it had submitted the final version of its Age Appropriate Code of Practice (dubbed, the “Children’s Code” or “Kids Code” by some) to the Secretary of State in preparation for it to be laid in Parliament. The final code was published on 22 January 2020.

The code overlaps with other areas, including marketing, literacy, child protection, broadcasting and gaming. The ICO encourages providers to consider many other codes of practice, including those produced by the Committee of Advertising Practice and the Office of Fair Trading, in addition to any Government or medical guidance relevant to children. This may require liaising with stakeholders across your business.

Who needs to comply with the code?

The code applies to ISS providers offering services such as apps, websites, content streaming services and electronic services for controlling connected toys, which are likely to be accessed by children (i.e. such access must be more probable than not). For the purposes of compliance with the code, the ICO considers a child to be anyone under the age of 18 (this may seem strange to specialists in the area as a child can typically enforce their own privacy rights in this area from the age of 13). 

The code says service providers should take a common sense approach to the question of whether a service is likely to be accessed by children – taking into account:

  1. the nature and content of the service and whether that has particular appeal for children; and
  2. the way in which the service is accessed and any measures put in place to prevent children from gaining access. (On the topic of age verification, the ICO recently confirmed in a blog post that it is considering how organisations can tackle age assurance and will formally set out its position in autumn 2021.)

Organisations that determine that their service is not likely to be accessed by children, and therefore that they will not be implementing the code, should document and support the reasons behind that decision.

The code applies to online services based in the UK and also online services based outside the UK with a branch, office or other establishment in the UK and processing personal data in the context of the activities of that establishment. In addition, the code applies to service providers established outside the UK and offering services to UK users, or monitoring the behaviour of users in the UK.

It should be noted that this definition is extremely broad. A website or service does not need to be a paid-for service to fall within this definition, and even sites advertising purely offline services can still be caught if they otherwise fall within the above requirements.

What if you don’t comply?

The code will be considered by the ICO and the courts when enforcing the UK GDPR and other relevant legislation. The ICO has the power to issue fines of up to £17.5 million or 4% of annual worldwide turnover, whichever is higher.

What does the code say you need to do?

The code sets out 15 cumulative and interdependent standards of age appropriate design, which must be implemented to ensure compliance with the code. The standards are summarised at the beginning of the code, and we have extracted the key requirements for you below:

1. Best interests of the child

When designing and developing online services, the best interests of the child should be a primary consideration and balanced against other interests. To meet this standard, ISS providers should consider how their use of a child user’s personal data can, amongst other things:

  • keep them safe from exploitation risks, including of commercial or sexual exploitation;
  • protect and support their physical, psychological and emotional development and health;
  • protect and support their need to develop their own views and identity and their right to freedom of association and play; and
  • recognise the role of parents in protecting and promoting the best interests of the child and support them in this task.

2. Data protection impact assessment

The code recommends using data protection impact assessments (“DPIA”) to help identify and minimise data protection risks which arise from the processing of the personal data of children likely to access the ISS. This is an extension to the legal requirement to undertake DPIAs in advance of high risk processing activities.

A DPIA should be embedded into the design of the ISS at an early stage (and before the processing has commenced). It should describe the processing and identify and assess potential risks which may impact children, such as self-esteem issues, online grooming or other sexual exploitation and excessive screen time. Measures to combat such risks should be identified and implemented. Consultation with children and parents may be required, depending on resources and the risks identified.

A DPIA template is annexed to the code.

3. Age appropriate application

ISS providers should ensure that all children are protected and so the level of protection should be appropriate for their age. This involves establishing (with a level of certainty appropriate to the risks to the rights and freedoms that arise from the relevant data processing) what age range the individual users of the ISS fall into, so that privacy protections and safeguards can be tailored accordingly.

ISS providers are asked to consider the age of children accessing their service. To do this, ISS providers can use third party age verification services or use artificial intelligence to assess the user’s age by analysing the way in which they interact with the service. Self-declaration without further evidence will only be appropriate for low risk processing or where used in conjunction with other techniques. Controversially, the code also suggests that ISS providers can ask for formal identity documents (e.g. passports) if the risks involved in processing are particularly high. We would recommend thinking carefully before collecting such data.

The following development stages, categorised by age, should be used as a guide when determining what is needed for ‘age-appropriate design’:

  • 0-5: pre-literate and early literacy
  • 6-9: core primary school years
  • 10-12: transition years
  • 13-15: early teens
  • 16-17: approaching adulthood

The code’s standards should be adapted for the particular age range of your users. For instance, if you expect that your service will be accessed by children aged between 6 and 9, you should consider providing privacy information through video or other formats which children will be more likely to access and understand.

If the age of the relevant ISS users cannot be established with any level of certainty, you should apply the code’s standards to all users and ensure that all users receive some basic protections in how their personal data is used by default.
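The age bands above lend themselves to a simple lookup when tailoring protections. The sketch below is purely illustrative (the function name and the "adult" fallback are assumptions, not taken from the code); the band boundaries themselves come directly from the code's guidance.

```python
# Illustrative sketch: map a (verified) age to the code's development
# stages so privacy defaults and information formats can be tailored.
# Band boundaries are taken from the code; everything else is assumed.

def development_stage(age: int) -> str:
    """Return the code's development stage for a given age."""
    if age < 0:
        raise ValueError("age must be non-negative")
    if age <= 5:
        return "pre-literate and early literacy"
    if age <= 9:
        return "core primary school years"
    if age <= 12:
        return "transition years"
    if age <= 15:
        return "early teens"
    if age <= 17:
        return "approaching adulthood"
    return "adult"  # outside the code's scope

print(development_stage(8))  # core primary school years
```

In practice the input would come from whatever age assurance technique the provider has adopted; where no age can be established, the code's fallback (applying the standards to all users) makes this lookup unnecessary.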

4. Transparency

ISS providers should explain what they do with children’s personal data in an easy to find location. This information should also be provided in “bite-sized” chunks at the point at which use of the personal data is activated (known as a “just in time notice”).

Information should be presented in a child friendly way, which may mean using diagrams, cartoons, video and audio content. It should be tailored to the age of the children accessing the service. This may mean allowing children or parents to up-scale or down-scale the information depending on their level of understanding.

5. Detrimental use of data

Children’s personal data must not be used in ways that are obviously detrimental to their health or wellbeing, that have been shown to be detrimental, or that go against industry codes of practice, other regulatory provisions or Government advice on the welfare of children.

Examples include:

  • using personal data in ways which encourage children to continue using the service, such as offering personalised in-game advantages;
  • aggressive commercial practices; or
  • marketing certain products, such as food and drink which are high in fat, salt or sugar.

ISS providers should keep up to date with relevant standards and codes of practice within their industry and Government advice on the welfare of children.

6. Policies and community standards

ISS providers should adhere to their own published terms and conditions and policies. In particular, they should only use personal data in accordance with their privacy policy and uphold any user behaviour policies.

7. Default settings

Privacy settings must be set to ‘high privacy’ by default. This means that, unless the child amends their settings themselves or there is a compelling reason for a different default setting:

  • their personal data cannot be accessed by other users; and
  • only the personal data needed to provide each individual element of your service is collected and is used only to the extent necessary for the provision of that element of the service.

If children do change their privacy settings, they should be given clear and age appropriate explanations of the consequences of doing so and it should be easy to revert back to high privacy settings.

If children share access to your service with an adult, you should ensure that each user can have their own profile with their own individual privacy settings. Children should not be exposed to the lower privacy settings of an adult with whom they share an account.
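As a rough illustration of the standard above, default settings can be modelled so that every new profile starts at 'high privacy' and each user on a shared device keeps independent settings. This is a minimal sketch only; the field and class names are assumptions, not terms from the code.

```python
# Illustrative sketch: per-profile privacy settings defaulting to
# 'high privacy' (standard 7). All names here are assumptions.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    visible_to_other_users: bool = False  # personal data not accessible to others
    personalised_content: bool = False    # collect only what each element needs
    data_sharing: bool = False

@dataclass
class UserProfile:
    name: str
    is_child: bool
    settings: PrivacySettings = field(default_factory=PrivacySettings)

# A child and an adult sharing a service each get their own profile,
# so the child is never exposed to the adult's lower settings.
child = UserProfile("child", is_child=True)
adult = UserProfile("parent", is_child=False)
adult.settings.data_sharing = True

print(child.settings.data_sharing)  # False - unaffected by the adult's choice
```

Using `default_factory` ensures each profile receives its own fresh high-privacy settings object rather than sharing one mutable default.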

8. Data minimisation

ISS providers should only collect the minimum amount of personal data needed to deliver the element(s) of the ISS in which the child is actively and knowingly engaged.

You should differentiate between each individual element of your service and identify what personal data you need to deliver it. Children should be given as much choice as possible over which elements of the service they want to use, and therefore how much personal data they need to provide.

9. Data sharing

ISS providers should not disclose children’s data unless they can demonstrate a compelling reason to do so, taking account of the best interests of the child. High privacy settings should be the default position so data sharing should already be limited and only permitted when children actively change their default settings. Settings which allow general or unlimited sharing will not be compliant. Compelling reasons to disclose children’s data would include for safeguarding purposes or detecting crimes against children.

10. Geolocation

Geolocation tracking options should be turned off by default (unless you can demonstrate a compelling reason for the setting to be turned on by default, taking account of the best interests of the child).

Age appropriate information should be provided in each case alerting the child to the use of geolocation data or profiling and prompting them to discuss this with an adult if necessary. It should also be obvious to the child when location tracking is active, such as by use of a clear symbol visible to the user. Once the child has finished their session, the tracking should be switched off, unless there is a compelling reason to do otherwise.
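The three behaviours required by this standard (off by default, an obvious indicator while active, and no persistence beyond the session) can be sketched as follows. This is a hypothetical illustration; the class and method names are assumptions.

```python
# Illustrative sketch of standard 10: geolocation off by default,
# a visible indicator while tracking is active, and an automatic
# reset when the session ends. All names are assumptions.

class Session:
    def __init__(self) -> None:
        self.location_tracking = False  # off by default

    def enable_location(self) -> str:
        """Child actively opts in; return an obvious on-screen indicator."""
        self.location_tracking = True
        return "Location tracking is ON"

    def end(self) -> None:
        # tracking does not persist into the next session
        self.location_tracking = False

s = Session()
s.enable_location()
s.end()
print(s.location_tracking)  # False
```

The key design point is that the opt-in is scoped to the session object, so a fresh session always starts back at the high-privacy default.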

11. Parental controls

Parental controls, whilst an important way of helping parents to protect their child’s best interests, impact upon the child’s right to privacy. Age appropriate information should be provided to alert the child to the fact that parental controls are in place, including an alert when any parental monitoring or tracking is active.

12. Profiling

Profiling involves any automated processing of personal data, usually based on a user’s online activity or browsing history, to:

  • find something out about an individual, such as their preferences, location and economic situation;
  • predict their behaviour; and/or
  • make decisions about them.

Profiling can be a useful tool for determining which content, including advertising content, a user might be interested in, but can also be used in establishing a child’s age, for child protection and the prevention of crime.

Children should generally be offered the option to turn profiling ‘on’ or ‘off’. The exception is where profiling is necessary to deliver the service the child has chosen to use. An example is where profiling is used for age verification purposes, as this is necessary in providing an age appropriate service in compliance with the GDPR and the ICO’s code. For any non-essential profiling, profiling should be ‘off’ by default and options to turn on profiling should differentiate between separate services.

Cookies and similar technologies are often used for profiling, ‘remembering’ a user’s device and storing information about their preferences or past actions. Unless the service that the child wants to access cannot be delivered without a cookie, under PECR you must obtain the child’s consent for using it. PECR also requires that clear, age appropriate information be provided about the use of all cookies.
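The PECR rule just described reduces to a simple gate: a cookie may be set only if it is strictly necessary to deliver the service, or if the child has actively consented. The sketch below is illustrative only; the function name and parameters are assumptions, not drawn from PECR or the code.

```python
# Illustrative sketch of a PECR-style consent gate: non-essential
# cookies require active consent; cookies strictly necessary to
# deliver the requested service are exempt. Names are assumptions.

def may_set_cookie(strictly_necessary: bool, consent_given: bool) -> bool:
    """Return True if the cookie may be set under the PECR rule."""
    return strictly_necessary or consent_given

# a session cookie needed to deliver the chosen service: allowed
print(may_set_cookie(strictly_necessary=True, consent_given=False))   # True
# a profiling cookie without consent: not allowed
print(may_set_cookie(strictly_necessary=False, consent_given=False))  # False
```

Note that under this standard consent must be an active, informed opt-in, so `consent_given` should only ever be set from a clear affirmative action, never from a pre-ticked box or nudged default.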

13. Nudge techniques

Nudge techniques are design features which ‘nudge’ users into choosing a particular option, such as a ‘yes’ option being larger and more prominent than a ‘no’ option.

Such techniques should not be used to encourage poor privacy decisions, such as nudging children into providing unnecessary personal data or turning off privacy protections. Depending on the age of the user, ‘pro-privacy nudges’ can be used where appropriate to direct younger children towards high privacy options and parental controls and involvement.

14. Connected toys and devices

Children’s toys and other devices which are connected to the internet are subject to the code. Such toys can cause issues because they can collect a large amount of data via functions such as cameras and microphones. The code particularly warns against ‘passive’ collection of personal data. It should be clear when personal data is being collected, for example by a light that switches on when the device is audio recording.

15. Online tools

Online tools can be used to help children exercise their rights and report concerns. For instance, a child can use a ‘download all my data’ tool to exercise their rights of access and to data portability, and a ‘stop using my data’ tool to restrict or object to processing.

ISS providers are encouraged to provide these tools in ways that are easy for the child to find and use. They should be tailored to the age of the child in question; for instance, video or audio material and prompts to seek help from a parent or trusted adult should be provided for young children.

Consent and parental consent

A lawful basis is needed for any processing of personal data. There are six lawful bases under the GDPR, one of which is consent. As with any processing, an assessment of the suitable lawful basis for any service should be undertaken and recorded in your privacy notice(s) (and DPIA in the case of services covered by the code).

Where you choose to rely upon consent as a lawful basis, you must ensure you obtain a consent which meets the standards of the GDPR, i.e. freely given, specific, informed and unambiguous. You must provide a positive opt-in method of consent which is clear and which is separate from your terms and conditions.

When do you need to get parental consent?

The code provides that if you:

  • make your service available to children; and
  • rely on consent as your lawful basis for any processing, use of cookies, profiling or processing of special category (or sensitive) personal data,

you should make ‘reasonable efforts’ to obtain and verify parental consent for children under 13. The code does not specify any particular steps; your approach should depend on the risk of the processing. You will also need to comply with the GDPR in relation to any age or identity verification information you collect, bearing in mind the purpose limitation, data minimisation, storage limitation and security principles. A data protection impact assessment may help in deciding how to verify age and parental responsibility.