
The ICO Children’s Code

The ICO Children’s Code, formally known as the Age Appropriate Design Code, supports organisations in designing digital services that meet children’s needs, respect their rights, and help them to explore, play, and grow online.

The code is grounded in the UN Convention on the Rights of the Child (UNCRC) which states in Article 3: “In all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities, or legislative bodies, the best interests of the child shall be a primary consideration.”

The basic rights outlined in the UNCRC cover a range of developmental, health, safety, and privacy issues. The Code also recognises that each child is an individual and should be treated as such. An internet search returns an array of advice about keeping children safe in the digital world, but much of it is very generalised and fails to recognise the individuality of children.


This is why both the Children’s Code and UK data privacy legislation make use of the Data Protection Impact Assessment (DPIA) risk management tool. In the document introducing the code, the UK privacy regulator, the Information Commissioner’s Office (ICO), says: “The DPIA is an important process in which you can consider and document how you use personal data, what risks that might pose, and the solutions for addressing the risks. Assessing children’s best interests is an important part of the DPIA process. This is obligatory for all organisations in-scope of the Children’s Code.”

The ICO has published the ‘Best interests of the child self-assessment’ on its website, along with a suite of tools, templates, and other resources to help organisations understand whether they are acting in the best interests of children through their data protection policies and procedures.

The Children’s Code, considered a robust set of considerations and recommendations rather than further UK legislation, will support organisations involved in Artificial Intelligence (AI), software development, and the provision of certain online services to children.

CSRB are supporters of software-enhanced learning for children and understand that the current generation of young people live much of their lives online, and that this will continue to be the case. Nonetheless, it is essential that the personal data of our children is protected in line with the UNCRC and relevant UK and international privacy legislation and regulations (e.g. UK GDPR).

CSRB can oversee Data Protection Impact Assessments (DPIAs) with organisations involved in AI and software development, putting children’s privacy at the very heart of every software design brief. By supporting these organisations with certified data protection support at an early stage of the design process, we can ensure that data privacy by design is built into new software products and services, protecting the child and their parents, guardians, and carers.

CSRB can assist with communicating both Data Controller and Data Processor responsibilities under UK GDPR, and the accountability organisations must demonstrate to Data Subjects through their practices and safeguards. Our practitioners undertake monthly continuing professional development training, enabling them to provide up-to-date certified advice to our clients and the wider business community.

Recital 38 of UK GDPR says: “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned, and their rights in relation to the processing.”

CSRB is committed to protecting the privacy of children, whilst allowing commercial organisations to develop and launch innovative new online support services for children.

Not sure about your responsibilities as outlined in the ICO Children’s Code? Please get in touch with us here or call 0117 325 0830 to get certified support and peace of mind regarding child privacy.