Safer’s 2024 Impact Report



At Thorn, we’re dedicated to building cutting-edge technology to defend children from sexual abuse. Key to this mission is our child sexual abuse material (CSAM) and CSE detection solution, Safer, which enables tech platforms to find and report CSAM and text-based harms on their platforms. In 2024, more companies than ever deployed Safer on their platforms. This widespread commitment to child safety is crucial to building a safer internet and using technology as a force for good.

Safer’s 2024 Impact

Though Safer’s community of customers spans a wide range of industries, they all host content uploaded by their users or text inputs in generative engines and messaging features.

Safer empowers their teams to detect, review, and report CSAM and text-based child sexual exploitation at scale. The scope of this detection is critical. It means their content moderators and trust and safety teams can find CSAM amid the millions of content files uploaded and flag potential exploitation amid the millions of messages shared. This efficiency saves time and accelerates their efforts. Just as importantly, Safer enables teams to report CSAM or instances of online enticement to central reporting agencies, like the National Center for Missing & Exploited Children (NCMEC), which is essential for child victim identification.

Safer’s customers rely on our predictive artificial intelligence and a comprehensive hash database to help them find CSAM and potential exploitation. With their help, we’re making strides toward reducing online sexual harms against children and creating a safer internet.

Total files processed

In 2024, Safer processed 112.3 billion files input by our customers. Today, the Safer community includes more than 60 platforms, with millions of users sharing an incredible amount of content daily. This represents a substantial foundation for the important work of stopping the repeated and viral sharing of CSAM online.

Total potential CSAM files detected

Safer detected just under 2,000,000 images and videos of known CSAM in 2024. This means Safer matched the files’ hashes to verified hash values from trusted sources, identifying them as CSAM. A hash is like a digital fingerprint, and using them allows Safer to programmatically determine whether a file has previously been verified as CSAM by NCMEC or other NGOs.
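To make the mechanics concrete, here is a minimal sketch of hash-based matching in Python. It uses a cryptographic hash for simplicity; production systems like this also rely on perceptual hashes that survive resizing and re-encoding. The function names and the verified-hash set are illustrative assumptions, not Safer’s actual API.

```python
# Minimal sketch of hash matching, under stated assumptions.
# A cryptographic hash stands in for the "digital fingerprint";
# all names here are hypothetical, not Safer's real interface.
import hashlib

def file_fingerprint(path: str) -> str:
    """Compute a SHA-256 hash of a file's bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path: str, verified_hashes: set[str]) -> bool:
    """Match a file's hash against hashes verified by trusted sources."""
    return file_fingerprint(path) in verified_hashes
```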

In addition to detecting known CSAM, our predictive AI detected more than 2,200,000 files of potential novel CSAM. Safer’s image and video classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review. Identifying and verifying novel CSAM allows it to be added to the hash library, accelerating future detection.
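The loop this describes might look something like the following sketch: a model scores new content, high-scoring files go to human review, and verified files feed the hash database so future copies are caught by fast matching. The model, threshold, and function names are assumptions for illustration only.

```python
# Illustrative classify-review-hash loop; all names are hypothetical.
import hashlib
from typing import Callable

def triage_new_file(
    data: bytes,
    predict: Callable[[bytes], float],  # assumed model: probability of CSAM
    review_queue: list,
    threshold: float = 0.8,             # illustrative cutoff, not Safer's
) -> None:
    """Flag likely novel CSAM for human review."""
    if predict(data) >= threshold:
        review_queue.append(data)

def on_verified(data: bytes, verified_hashes: set[str]) -> None:
    """After moderators verify a file, store its hash so future copies
    are detected by hash matching rather than the classifier."""
    verified_hashes.add(hashlib.sha256(data).hexdigest())
```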

Altogether, Safer detected more than 4,100,000 files of known or potential CSAM.

Total lines of text processed

Safer launched a text classifier feature in 2024 and processed more than 3,000,000 lines of text in just its first year. This capability provides a whole new dimension of detection, helping platforms identify sextortion and other abusive behaviors occurring via text or messaging features. In all, nearly 3,200 lines of potential child exploitation were identified, helping content moderators respond to potentially threatening behavior.

Safer’s all-time impact

Last year was a watershed moment for Safer, with the community nearly doubling the all-time total of files processed. Since 2019, Safer has processed 228.8 billion files and 3 million lines of text, resulting in the detection of nearly 6.5 million potential CSAM files and nearly 3,200 instances of potential child exploitation. Every file processed, and every potential match made, helps create a safer internet for children and content platform users.

Build a Safer internet

Curbing platform misuse and addressing online sexual harms against children requires an “all-hands” approach. Too many platforms still suffer from siloed teams, inconsistent practices, and policy gaps that undermine effective content moderation. Thorn is here to change that, and Safer is the answer.