Filters and moderators are essential for a clean experience, said Claire Quinn, safety chief at Wee World, a smaller site aimed at kids and young teens.
Duncan, one of a half-dozen law enforcement officials interviewed who praised Facebook for triggering inquiries, said: 'The manner and speed with which they contacted us gave us the ability to respond as soon as possible.'

Facebook is among the many companies that are embracing a combination of new technologies and human monitoring to thwart sex predators.
Such efforts generally start with automated screening for inappropriate language and exchanges of personal information, and extend to using the records of convicted pedophiles' online chats to teach the software what to seek out.
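As a rough illustration only, the first automated stage described above might resemble the sketch below: a message is flagged for human review if it matches a blocklist of risky phrases or appears to share personal contact details. The phrase list, pattern, and function name are all assumptions for the example; real systems are far more sophisticated, including models trained on records of known offenders' chats.

```python
import re

# Assumed example blocklist; real deployments use much larger,
# continually updated lists and learned models.
RISKY_PHRASES = {"what's your address", "send me your email"}

# Crude pattern for spotting an exchanged email address.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def flag_message(text: str) -> bool:
    """Return True if the message should be routed to a human moderator."""
    lowered = text.lower()
    # Stage 1: screen for inappropriate or grooming language.
    if any(phrase in lowered for phrase in RISKY_PHRASES):
        return True
    # Stage 2: screen for exchanges of personal information.
    return bool(EMAIL_PATTERN.search(text))
```

In practice such a filter only triages: flagged messages go to human moderators, who make the final call, as the sites described in this article do.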
Metaverse Chief Executive Amy Pritchard said that in five years her staff had intercepted something 'terrifying' only once, about a month ago, when a man on a discussion board for a major media company asked for the email address of a young site user.
The software recognised that the same person had been making similar requests of others and flagged the account for Metaverse moderators.
From a business perspective, however, there are powerful reasons not to be so restrictive, starting with teen expectations of more freedom of expression as they age.
If they don't find it on one site, they will find it somewhere else.
Most sex crimes against children are committed by people the children know, rather than strangers.
Even those companies with state-of-the-art defenses spend far more time trying to stop online bullying and attempts to sneak profanity past automatic word filters than they do fending off sex predators.
Metaverse moderators called the media company, which then alerted authorities.