
    Report Remove

    To support young people in removing sexual images or videos of themselves online, the IWF and NSPCC developed Report Remove, a world-first tool launched in June 2021.

    The NSPCC’s Childline service ensures that the young person is safeguarded and supported throughout the process and the IWF assesses the reported content and takes action if it meets the threshold of illegality. The content is given a unique digital fingerprint (a hash) which is then shared with internet companies to help prevent the imagery from being uploaded or redistributed online.
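In practice the IWF uses specialised image-hashing technology to generate these digital fingerprints. As a rough illustration only, the sketch below uses an ordinary cryptographic hash (SHA-256) to show the principle: the same file always yields the same digest, so a platform holding a list of digests can block exact re-uploads. Real deployments typically rely on perceptual hashes, which can also match visually similar copies of an image, not just byte-identical ones.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that uniquely identifies this exact file."""
    return hashlib.sha256(image_bytes).hexdigest()

# Two identical copies of an image produce the same digest,
# so a platform can block a re-upload by comparing fingerprints.
original = b"\x89PNG...example image bytes..."
reupload = b"\x89PNG...example image bytes..."
assert fingerprint(original) == fingerprint(reupload)

# Any change to the file yields a completely different digest --
# this is why exact-match hashing alone cannot catch edited copies.
modified = original + b"\x00"
assert fingerprint(modified) != fingerprint(original)
```

The filename-free, bytes-in/digest-out design means the imagery itself never needs to be stored or shared with platforms; only the fingerprint list is distributed.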

    This solution provides a child-centred approach to image removal which can be done entirely online. The young person does not need to tell anyone who they are; they can make the report at any time, and further information and support is always available from Childline.

    Young people create or sign into a Childline account which then allows them to receive Childline email updates about their report. Young people can use this email service for ongoing support, and they can contact a Childline counsellor via online chat and via their freephone number. They can also access relevant information and advice, self-help tools and peer support on the Childline website.

    • In 2022, we received 187 reports through the Report Remove tool.
    • We were able to take action on 101 reports.

    Analysis of ‘actionable’ reports

    These are reports assessed as containing images and/or videos of child sexual abuse according to UK legislation, or URLs hosting such imagery.

    Of the 101 reports assessed as containing criminal imagery, most contained Category C images (69%), and more boys reported to us than girls, with boys making up 73% of the total.


    This chart provides an overview of the sex of the child depicted in the images or videos sent through the Report Remove tool. Most – almost three quarters – came from boys.

    Sexually coerced extortion

    A quarter (24%, or 18) of the 74 actionable reports from boys involved sexually coerced extortion. We’ve seen how boys are typically lured into what they believe are mutual exchanges of sexual images, mistakenly believing they are sharing images with a peer or older person.

    We know that sexually coerced extortion is behind these specific reports from boys because they have included evidence of it within their reports. This could be a chat log in which the young person has demonstrated that they are being coerced, or a collage of images created by the offender and overlaid with threatening text. No such evidence was present in any of the reports from girls.


    This chart shows how boys aged 16 and 17 represent the biggest user group of the Report Remove service.


    Most images and videos reported by young people through Report Remove (and assessed as child sexual abuse material) fall into Category C, with a notable proportion of the imagery of boys assessed as Category B.

    Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal or sadism.
    Category B: Images involving non-penetrative sexual activity.
    Category C: Other indecent images not falling within categories A or B.