Last Updated on 19th November 2021

What is it?

The Age Appropriate Design Code (sometimes referred to as The Children’s Code) is a new code of practice that sets out standards of age-appropriate design for “information society services” that are likely to be accessed by children. In other words, it sets out new requirements for any company that offers an online service likely to be accessed by children.

The Code itself contains 15 standards that online services must adhere to, covering areas such as privacy, transparency, and data sharing. You can find the full list of standards below. It first came into force in September 2020, but companies were given a 12-month transition period to comply, which ended on 2nd September 2021.


The Age Appropriate Design Code applies to anyone under 18 years old.

What Are the 15 Standards of the Code?

The code sets out 15 standards for companies to adhere to, which are:

1. Best interests of the child
2. Data protection impact assessments
3. Age appropriate application
4. Transparency
5. Detrimental use of data
6. Policies and community standards
7. Default settings
8. Data minimisation
9. Data sharing
10. Geolocation
11. Parental controls
12. Profiling
13. Nudge techniques
14. Connected toys and devices
15. Online tools

The full details of each standard of the Age Appropriate Design Code, and what each one means, can be found here.

Why Has the Code Been Created?

From playing games on parents’ phones and watching cartoons on YouTube, to getting their own devices and joining social media, children and young people use the online world every day. However, the internet wasn’t created with safeguarding children in mind, and nor were the rules and regulations that companies operating online previously had to follow. This is especially important now, when so much of children’s personal data is collected and processed online.

As adults, we hopefully have a better understanding of what we’re agreeing to online. For example, we might understand what we’re signing up to when we accept a data usage pop-up, whilst children might not. Or when we allow an app to use geolocation, we understand the risks of location sharing, whereas children may only see the novelty of sharing that information.

These new standards are about making the digital space where children learn, play and socialise a safer place to be.

“For all the benefits the digital economy can offer children, we are not currently creating a safe space for them to learn, explore, and play. This statutory code of practice looks to change that, not by seeking to protect children from the digital world, but by protecting them within it.”

The Information Commissioner’s Office

The code stems from the United Nations Convention on the Rights of the Child (UNCRC). It recognises the special safeguards children need in all aspects of their lives, including within data protection law such as GDPR. This code is the first of its kind, with the potential to influence changes to data protection policies in other countries.

For platforms that don’t comply, there could be serious consequences. The ICO retains enforcement powers under GDPR and other relevant laws. If companies breach the Age Appropriate Design Code, they can be served with warnings, enforcement notices, and fines.

Who Created It?

The Code comes from the United Kingdom’s Information Commissioner’s Office (ICO), the UK’s independent body that upholds information rights. The ICO regulates legislation such as the Data Protection Act, GDPR, and the Freedom of Information Act. This new code comes under its remit due to its relevance to information rights, data protection, and the privacy of electronic communications.

Who Does It Apply To?

The code applies to companies that fall under the bracket of information society services. Simply put, this is any business that provides a service online in exchange for money. This doesn’t necessarily mean the user pays the company directly; the company could be earning money through advertising and/or data.
This includes:

  • programs
  • apps
  • search engines
  • social media platforms
  • online messaging or internet-based voice services
  • online marketplaces
  • content streaming services e.g. video, music, or gaming services
  • online games
  • news or educational websites
  • any websites offering other goods or services to users over the internet

It’s not just UK-based companies that are affected; even non-UK companies will have to comply if they process the personal data of UK children.

This means apps and websites such as YouTube, Facebook, and Google will all have to adhere to the new standards. Even if a service isn’t specifically aimed at children, the code must be implemented if the service is likely to be accessed by children under 18.

What Changes Will I See?

There have been changes across thousands of apps and websites over the last 12 months, some more noticeable than others. Changes have included:

  • Checking the age of the people who visit the website, download an app, or play the game.
  • Switching off geolocation services by default for users under 18.
  • No longer using nudge techniques that encourage children to enter more personal data.
  • Providing the highest level of privacy by default.
  • Making greater efforts to protect the privacy and security of children online.

You can learn more about recent changes to some of the most popular platforms through our further resources section below.

What Does It Mean for Parents, Carers, and Safeguarding Professionals?

Although the Age Appropriate Design Code goes a long way in helping keep our children and young people safe, it doesn’t mean we should stop taking action to help safeguard them. It’s still important that we take steps to keep our children safe online, such as:

  • Engage young people in a conversation. Make sure they know the importance and value of protecting their personal information online, and that they should only share location data with people they trust.
  • Check that they know who to talk to if someone makes them feel uncomfortable online, such as a Trusted Adult.
  • Teach young people about the importance of knowing how to report and block on the platforms they use.
  • Check that all the devices your children own or have access to have the best safety settings enabled. Speak to the children in your care about their safety and privacy settings online, and check that they limit public access to their social media images.

Further Resources

  • To help you feel more confident assisting your young person with setting and reviewing their privacy settings on popular platforms, we would encourage you to visit our Safety Centre. You’ll find the latest information on how to block or report, password protection, and so much more.
  • Sign up for our free Safeguarding Hub newsletter below, which will provide you with the latest in safeguarding news and alerts.
  • Find out more about the new code on the Information Commissioner’s Office website.

Join our Safeguarding Hub Newsletter Network

Members of our network receive weekly updates on the trends, risks and threats to children and young people online. 


Who are your Trusted Adults?

The Trusted Adult video explains who young people might speak to and includes examples of trusted adults, charities and organisations.

Pause, Think and Plan

Use our video for guidance and advice around constructing conversations about the online world with the children in your care.
