OpenAI sent 80 times as many child exploitation incident reports to the National Center for Missing & Exploited Children during the first half of 2025 as it did during a similar time period in 2024, according to a recent update from the company. The NCMEC’s CyberTipline is a Congressionally authorized clearinghouse for reporting child sexual abuse material (CSAM) and other forms of child exploitation.
Companies are required by law to report apparent child exploitation to the CyberTipline. When a company sends a report, NCMEC reviews it and then forwards it to the appropriate law enforcement agency for investigation.


(To be clear to anyone reading: you may want to explain that the quotes above are from the article.)
While it’s true the total number of reports is higher than the count of individual pieces of content, it’s hardly a “shit misleading headline” given that the amount of content reported is still 22x higher than in the same period last year (74,559 vs. 3,252 individual pieces of content).
Ars has a pretty good point, and oddly the only reason it’s 80x higher is that in 2024 H1 they had roughly a third as many reports as pieces of reported content, while in 2025 H1 the ratio of reported content to reports is pretty much 1:1. That disparity is interesting, and I’d like to know more. If only someone had written quite a good article about it that goes into much greater depth about the causes.
(They also explicitly list the numbers I cite here, in context with the report numbers, so they’re doing a better job than I am. I encourage people to read the article on this one; it’s actually quite good.)
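The arithmetic in the comments above can be sketched out quickly. Only the per-content counts (74,559 for 2025 H1; 3,252 for 2024 H1) appear in the thread; the report-side counts below are back-solved from the stated 80x figure and the approximate 1:3 and 1:1 ratios, so treat them as illustrative assumptions rather than official numbers.

```python
# Pieces of reported content, as cited in the thread.
content_2024_h1 = 3_252
content_2025_h1 = 74_559

# Content grew roughly 22x, matching the "22x higher" claim.
content_growth = content_2025_h1 / content_2024_h1  # about 22.9

# Assumption: 2025 H1 reports are ~1:1 with content, while 2024 H1
# reports were roughly a third of content (~1:3.5, back-solved so
# that the overall report growth lands near the stated 80x).
reports_2025_h1 = content_2025_h1
reports_2024_h1 = content_2024_h1 / 3.5
report_growth = reports_2025_h1 / reports_2024_h1  # about 80

print(round(content_growth, 1), round(report_growth, 1))
```

This shows how a modest shift in the reports-per-content ratio between years can turn a 22x increase in content into an 80x increase in reports.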