I just have a hard time believing anyone working at Roblox itself is completely ignorant of problems like these. I think the company knows it's one of the most successful kids' MMOs still around and that clueless parents will keep paying for Robux for their kids, so it doesn't care to do any more than the absolute bare minimum to moderate the site, and even then only when the spotlight is shined on them, before returning to business as usual.
I work as a data analyst and software developer, and in my spare time I made a Discord bot that does a lot of moderation-related stuff. One of the things it does is use AI to score messages for how inappropriate they are across a number of categories; one of those categories (it uses the OpenAI moderation toolkit) is content that talks sexually about minors. My bot is in a few thousand servers now and I collect the messages from all of them, and from this I feel I've learned two main things:
1. The number of pedophiles on the internet is fucking staggering. Anyone on KiwiFarms knows there's a lot of freaks online, but it's so much worse than I thought when I started building this.
2. The infrastructure to apply this kind of moderation is way cheaper than you think, way more effective than traditional techniques like exact-match word blacklists, and can be implemented in a couple lines of code. I can't stress enough how simple it is: even if you don't know anything about programming at all, you could throw it together in Python with 20 minutes of reading documentation (there's a rough sketch of what I mean right after this list).
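To give you an idea, here's a minimal sketch of the kind of call I'm talking about. This is not my actual bot code; the model name and the threshold are assumptions I'm making for the example, and in reality you'd tune the cutoff against real data instead of picking a round number.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

# Arbitrary cutoff for the example; you'd tune this against real traffic.
SEXUAL_MINORS_THRESHOLD = 0.5

def flag_message(text: str) -> bool:
    """Return True if the moderation endpoint scores this text high for sexual/minors content."""
    resp = client.moderations.create(
        model="omni-moderation-latest",  # assumption: whichever moderation model is current
        input=text,
    )
    result = resp.results[0]
    # category_scores holds one float per category; "sexual/minors" is the one that matters here
    return result.category_scores.sexual_minors >= SEXUAL_MINORS_THRESHOLD

if __name__ == "__main__":
    print(flag_message("some message pulled out of a channel"))
```

The real thing obviously logs the scores and batches messages instead of returning a bool per call, but that's the entire core of it. That's what I mean when I say the barrier here is not technical.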
This experience has been really blackpilling for me about the nature of the internet. Maybe I'm naive and should've realized earlier how bad it is out there, but it's shocking how much these companies do not fucking care. They know there are kids on their platform, they know there are groomers on their platform, they know they could build tools to at least kick them off their own platforms, but they don't do it. The tools exist to solve this, but the lack of interest runs through multiple levels: the companies don't want to say "oopsies, our platform is full of pedophiles, but now we're taking care of it." You can report these people to the government, but there's no real material incentive for them to care the way they do about tax fraud or drug cases, so the resources allocated to it are pretty minimal. You can report it to whoever moderates things on a local level and at least get these freaks banned from whatever particular place they're trying to chat up kids in, but more often than not, the moderators of those spaces are pedo freaks too.
I don't wanna derail the thread too much with this. Maybe sometime I'll make a whole thread about what I've learned from running this tool. Anyway, it's just insane to me how little everyone seems to give a fuck about pedophiles operating more or less out in the open. Everyone is really comfortable thinking that somebody else will handle it, but the end result is that it just festers indefinitely. I don't know what the solution is but things are looking pretty grim for today's kids.