Report Remove allows young people to report sexual images of themselves entirely online: the NSPCC’s Childline service ensures that the young person is safeguarded and supported throughout the process, while the IWF assesses the reported content and takes action if it meets the threshold of illegality. The content is given a unique digital fingerprint (a hash), which is then shared with internet companies to help prevent the imagery from being uploaded or redistributed online.
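As a rough illustration of how a platform might use such a shared hash list, the sketch below computes a cryptographic (SHA-256) hash of an uploaded file and checks it against a set of known hashes. This is a minimal sketch only: the `KNOWN_HASHES` set and `should_block_upload` function are hypothetical, and real deployments typically also rely on perceptual hashes (such as Microsoft's PhotoDNA) that can match re-encoded or resized copies, which a plain cryptographic hash cannot.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list shared with the platform (hex SHA-256 digests of
# known images). Real lists are distributed securely, are far larger, and
# include perceptual as well as cryptographic hashes.
KNOWN_HASHES: set[str] = set()  # populated from the shared list

def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: Path) -> bool:
    """Block an upload whose hash matches an entry on the shared list."""
    return sha256_of_file(path) in KNOWN_HASHES
```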
Report Remove provides a child-centred approach to image removal which can be completed entirely online. The young person does not need to tell anyone who they are, they can make the report at any time, and further information and support is always available from Childline.
Young people create or sign in to a Childline account, which then allows them to receive Childline email updates about their report. Young people can use this email service for ongoing support, and they can contact a Childline counsellor via online chat or by freephone. They can also access relevant information and advice, self-help tools and peer support on the Childline website.
These are reports that were assessed as containing images and/or videos of child sexual abuse, or URLs hosting such images and videos, as defined by UK legislation.
Of the 101 reports we assessed as containing criminal images, most (69%) contained Category C images, and more boys than girls reported to us, with boys making up 73% of the total.
This chart provides an overview of the sex of the child depicted in the images or videos sent through the Report Remove tool. Most – almost three quarters – came from boys.
A quarter (24%, or 18 reports) of the 74 actionable reports from boys involved sexually coerced extortion. We’ve seen how boys are typically lured into what they believe are mutual exchanges of sexual images, mistakenly thinking they are sharing them with a peer or older person.
We know that sexually coerced extortion is behind these specific reports from boys because they have included evidence of it within their reports. This could be a chat log in which the young person demonstrates that they are being coerced, or a collage of images created by the offender and overlaid with threatening text. No such evidence was present in any of the reports involving images of girls.
This chart shows that boys aged 16 and 17 represent the biggest user group of the Report Remove service.
Most images and videos reported by young people through Report Remove (and assessed as child sexual abuse material) fall into Category C, with a notable proportion of the imagery of boys assessed as Category B.
Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal or sadism.
Category B: Images involving non-penetrative sexual activity.
Category C: Other indecent images not falling within categories A or B.