The rapid evolution of Artificial Intelligence (AI) brings numerous benefits, but it also poses significant risks, particularly concerning the creation and sharing of AI-generated child sexual abuse material (AI-CSAM).
A new guide, created by the National Crime Agency (NCA) and the Internet Watch Foundation (IWF), provides essential information for all professionals working with children and young people.
Why is this guide crucial for professionals?
- Stay Informed: Learn about the various ways AI can be misused to create child sexual abuse imagery, including the manipulation of existing images or the generation of entirely new content.
- Understand the Law: Gain clarity on the legal implications of AI-generated child sexual abuse material, which is illegal even if it is not photorealistic.
- Effective Response: Access clear guidance on how to respond to incidents involving AI-CSAM, treating them with the same urgency and care as any other child sexual abuse issue. This includes reporting procedures and crucial “dos and don’ts” when encountering such material.
- Support for Victims: Learn about providing wellbeing support to victims, who may experience significant emotional and psychological impact.
The guide will equip education practitioners and all those who support children and young people in the UK with the knowledge they need to appropriately respond to incidents involving AI-generated child sexual abuse material.
The new guide makes it clear AI child sexual abuse imagery “should be treated with the same level of care, urgency and safeguarding response as any other incidence involving child sexual abuse material” and aims to dispel any misconception that AI imagery causes less harm than real photographs or videos.
The NCA and IWF are united in their commitment to protecting children online and will continue to work together to identify new opportunities for collaboration on this issue.
Children and young people can use the Report Remove tool from the IWF and Childline to report AI-generated child sexual abuse material that has been shared or might be shared online.