Project Artemis reviews text-based conversations and evaluates whether they show signs of child grooming. It assigns each conversation a rating, and companies can use those ratings to flag conversations for review by human moderators.
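The rate-then-flag flow described above can be sketched as follows. This is purely illustrative: Project Artemis's actual model, scoring scale, and API are not public, so every name here (`rate_conversation`, `flag_for_review`, the 0-to-1 score, the keyword stand-in for the classifier) is a hypothetical assumption, not the real system.

```python
# Hypothetical sketch of a rating-then-flag moderation pipeline.
# The real Project Artemis uses a learned classifier; the trivial
# keyword scorer below exists only so the sketch runs end to end.

from dataclasses import dataclass
from typing import List


@dataclass
class Conversation:
    id: str
    messages: List[str]


def rate_conversation(conv: Conversation) -> float:
    """Stand-in scorer: returns a risk rating in [0, 1].

    A production system would use learned patterns over the full
    conversation, not a word list.
    """
    risky_terms = {"secret", "meet", "alone"}  # illustrative only
    hits = sum(
        1
        for msg in conv.messages
        for word in msg.lower().split()
        if word.strip(".,!?") in risky_terms
    )
    total_words = sum(len(m.split()) for m in conv.messages) or 1
    return min(1.0, hits / total_words * 10)


def flag_for_review(convs: List[Conversation], threshold: float = 0.5) -> List[str]:
    """Return IDs of conversations whose rating meets the review threshold.

    Flagged IDs would then be routed to human moderators rather than
    acted on automatically.
    """
    return [c.id for c in convs if rate_conversation(c) >= threshold]
```

The key design point the article implies is that the tool's output is advisory: the rating only queues a conversation for a human decision, it does not remove content on its own.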
The project began in November 2018 at a Microsoft “360 Cross-Industry Hackathon.” Since then, Microsoft, The Meet Group, Roblox, Kik, Thorn, and others have helped build the tool. Beginning tomorrow, January 10th, licensing will be handled by Thorn, a nonprofit that builds technology to defend children from sexual abuse.
In a blog post announcing Project Artemis, Microsoft wrote:
“Project Artemis” is a significant step forward, but it is by no means a panacea. Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems. But we are not deterred by the complexity and intricacy of such issues. On the contrary, we are making the tool available at this point in time to invite further contributions and engagement from other technology companies and organizations with the goal of continuous improvement and refinement.
Microsoft is not the only Big Tech company fighting child exploitation and abuse. Last year, YouTube pulled hundreds of channels and disabled comments on tens of millions of videos after reports suggested a child porn ring existed on the platform. Facebook has said it uses machine learning to fight child exploitation, and the Tumblr app was once removed from the App Store when images depicting child sexual abuse got past its filters.