The ICO’s recent publication of its Age Appropriate Design Code adds another layer to recent discussions about the responsibilities of technology platforms to protect young users from harm.
The Code sets out 15 standards that organisations providing online services to children need to implement. It is intended to help those organisations design their services in a way that complies with the General Data Protection Regulation (GDPR), which makes clear that children’s personal data requires specific protection, as children are likely to be less aware of the risks involved in the processing of their data.
The scope of the Code is broad. Although the likes of Facebook and Google have been front and centre of much of the commentary around online harm, the Code is by no means limited to these household name tech giants. Any organisation providing an online service to children will need to take the Code into account when designing child-facing services.
The concept of an ‘online service’ is wide and includes apps, search engines, social media platforms, online messaging services, online marketplaces, content streaming services, voice over IP services, news and educational websites and connected devices. The Code applies not just to services that are designed specifically with children in mind, but also to services that are likely to be accessed by children.
In some ways, the standards reflect principles that organisations should already be taking into account when providing services to children, but they flesh out the requirements and give a clearer picture of what the regulator expects compliance to look like. The standards include treating the child’s best interests as a primary consideration, so that children are kept safe from exploitation risks and have their wellbeing appropriately supported.
The ICO expects data protection impact assessments to be completed to assess and mitigate risks to children’s rights and freedoms when processing their personal data. Privacy information must be in language suited to the age of the child, and additional, specific “bite-sized” explanations of how personal data will be used must be provided at the point at which that use begins.
The Code suggests allowing children to choose whether they see a basic or a more detailed version of privacy information, depending on their age and understanding. Personal data of children should not be used in ways that are detrimental to their wellbeing and default privacy settings must be as privacy-protective as possible. In particular, geolocation and profiling must be switched off by default and children should not be nudged towards changing to less protective settings.
The Code will now be laid before Parliament and is expected to take effect by autumn 2020, after which providers will have 12 months to make sure existing services comply. The Code itself isn’t legally binding, but if organisations don’t comply with it by the end of this 12-month period, they will find it harder to demonstrate compliance with the GDPR.
The ICO will take the Code into account when determining appropriate enforcement action, so failing to comply with the Code makes penalties more likely, including fines of up to €20m or 4% of annual worldwide turnover, whichever is higher. Amid continued scrutiny of online platforms, the Code demonstrates that regulators are committed to holding providers to account when their use of personal data harms those who are particularly vulnerable.
This publication is intended for general guidance and represents our understanding of the relevant law and practice as at January 2020. Specific advice should be sought for specific cases. For more information see our terms & conditions.