This should be the first indicator that it wasn't just the game or its contents that were at issue.
They're very clearly referring to actual abuse imagery, in addition to the game.
If it were just game/anime content, that would be stated outright, not implied.

We see a timeline here. Based on this alone, it's established that NCMEC's report was over actual CSAM.
The police didn't even know about the game until after searching his PC. The CyberTip was not related to the game. He had CSAM.

The software they used to do so employs hash-matching against known CSAM, in addition to image classifiers designed to broadly sort sexual content rather than to flag possible CSAM specifically. It wasn't until they found screenshots that they discovered the game itself.
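The hash-matching half of that pipeline can be sketched roughly as a set-membership check: hash the file, look it up in a database of hashes of known illegal images. This is a simplified illustration only; real systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding, not a cryptographic hash like SHA-256, and the example hash values here are placeholders, not anything real.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# Illustrative placeholder entries only.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image").hexdigest(),
}

def matches_known_hash(data: bytes) -> bool:
    """Return True if this file's hash appears in the known-content database."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# A hash match identifies *known* CSAM with near-certainty.
# The classifier pass (not shown) is the opposite: it flags unknown files
# that merely look sexual, which is why it sweeps up screenshots too.
```

The point of the sketch is the asymmetry the thread relies on: a hash match means the file was already in a vetted database, while a classifier hit only means the content looked sexual to a model.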

NCMEC maintains a large database of image hashes, which it shares to automate detection of CSAM, and it does not hash images that fail to meet the US definition of CP.
They audited their own PhotoDNA hash database and removed 5 'cartoon/anime' files for this reason.



The first paragraph here refers explicitly to photos and videos of real abuse material. '10 illegal photos and films' is specific wording, and it lines up perfectly with the first article.
Mentioning the game afterward does not make the 10 photos virtual.

Emphasis added. The specific use of the word 'photo' means they were of real children.
Also, wouldn't it be easier to share game content from his PC, and not his phone??

The article writer's choice to hyperfocus on the game, and not the other images, shouldn't be read as excluding anything else.
There is no implication or inference that the 10 other images were exclusively from the game.
I think this whole ordeal was disappointing to see. I could say more about how to properly infer things, but there's an accessibility barrier and a language barrier in play here, even with the translation.
Please be mindful of negativity bias and analyze more. Please.