Many well-respected tech companies came together to form a consortium called Project Artemis. The culmination of this multi-year, multi-company effort is a multi-faceted, text-based analysis system explicitly designed to detect child grooming. Several companies have developed systems to detect predatory behavior, but almost no information about them has been shared publicly, so efforts to combat a global problem have been conducted in isolation.
Detection is complicated by the fact that teenagers sometimes masquerade as adults. Artemis is an A.I. system that relies on a database of common textual patterns derived from confirmed child-predator conversations. It includes a risk detection engine, which assesses the degree of risk a conversation poses based on its content. The flexible design allows companies to incorporate their own custom detection methods, so the system can serve multiple industries.
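Project Artemis's actual scoring method has not been published, but a pattern-database risk engine of the kind described above could be sketched roughly as follows. Every pattern, phrase, and weight here is an invented placeholder, not data from Artemis:

```python
# Hypothetical sketch of pattern-based conversation risk scoring.
# All patterns and weights below are illustrative, not from Artemis.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class RiskPattern:
    name: str
    phrases: Tuple[str, ...]  # textual cues associated with this pattern
    weight: float             # contribution to the overall risk score


# Toy stand-in for the consortium's database of textual patterns.
PATTERNS: List[RiskPattern] = [
    RiskPattern("isolation", ("don't tell anyone", "our secret"), 0.6),
    RiskPattern("personal_probing", ("are you home alone", "how old are you"), 0.4),
]


def score_conversation(messages: List[str]) -> float:
    """Return a risk score in [0, 1] by accumulating weights of matched patterns."""
    text = " ".join(m.lower() for m in messages)
    score = 0.0
    for pattern in PATTERNS:
        if any(phrase in text for phrase in pattern.phrases):
            score += pattern.weight
    return min(score, 1.0)
```

A real system would of course use far richer signals than substring matching; the point of the sketch is only the overall shape: a shared pattern database feeding a scoring engine that rates whole conversations.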
The database of common textual patterns employed by child predators was compiled from data contributed by multiple companies and platforms. Unlike most text-processing engines, Artemis can analyze statements across multiple turns of a conversation. Finally, its modular architecture permits adaptation and expansion to suit the needs of different platforms, which is what allowed us at The Meet Group to add our own features and functionality.
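The modular, multi-turn design described above can be illustrated with a small sketch. The detector interface, registry, and example detector here are assumptions for illustration only; they do not reflect Artemis's actual internals:

```python
# Hypothetical sketch of a modular, multi-turn detection pipeline.
# Interface names and the example detector are illustrative assumptions.
from typing import Callable, List

# A detector sees the full conversation history, not just the latest message,
# mirroring the multi-turn analysis described in the article.
Detector = Callable[[List[str]], float]


class DetectionPipeline:
    """A pluggable pipeline: platforms register their own detector modules."""

    def __init__(self) -> None:
        self._detectors: List[Detector] = []

    def register(self, detector: Detector) -> None:
        """Plug in a platform-specific detection module."""
        self._detectors.append(detector)

    def assess(self, conversation: List[str]) -> float:
        """Run every registered detector; report the highest risk observed."""
        return max((d(conversation) for d in self._detectors), default=0.0)


def repeated_question_detector(conversation: List[str]) -> float:
    # Toy multi-turn signal: flag when the same question recurs across turns.
    questions = [m.strip().lower() for m in conversation if m.strip().endswith("?")]
    return 1.0 if len(questions) != len(set(questions)) else 0.0
```

The pipeline-plus-registry shape is what lets each platform extend the shared core without modifying it, which is the property the article attributes to Artemis's modular architecture.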
The Meet Group added real-time processing functionality, support for profile metadata, and two NLP algorithms. These additions improve Artemis not only for The Meet Group but for every company that uses it. We also enabled Artemis to determine when a user has lied about his or her age, a technique The Meet Group developed prior to Artemis. Given the collaboration behind Artemis, we believe it is an essential component of any social-oriented company's predator-detection toolkit. The Meet Group is honored to have participated in this project, and we commend Microsoft on its openness in broadly sharing this capability across the industry to protect children everywhere.
Read more here.