
Online grooming is a set of tactics that abusers use to connect with minors online and gain their trust in order to eventually coerce them into sexual behaviors (e.g., sharing sexual imagery, engaging in sexual conversations, or even meeting in person).
A survey of 1,200 youth (ages 9-17) conducted by Thorn, a nonprofit that builds technology and programs to defend children from sexual abuse, highlighted that 40% of respondents had been approached by someone who they thought was attempting to ‘befriend and manipulate’ them. It also highlighted that minors were ‘regularly encouraged to leave open forums to one-on-one environments by online-only contacts.’ [1]
Grooming is extremely difficult to detect because conversations between groomers and children aren’t always explicitly sexual in nature or centered on age. Messages that would be innocuous between two adults can constitute grooming when exchanged between an adult and a minor. With nearly 10 million chat messages exchanged on our platform daily, technological detection of online grooming in text-based conversations will be critical to identifying and removing predators and establishing a safer environment for our users.
What we’re doing
We are working to enhance our internal capabilities for preventing and detecting online grooming and child sexual exploitation across two key dimensions in ZEPETO:
- Expand our technological approach to preventative grooming detection
We are working in collaboration with Thorn to tackle this complex issue. Thorn has developed an ensemble of three models, which together make up the full Thorn Grooming Classifier, to detect grooming conversations using machine learning. We will be integrating this tool into chat features on our platform to more effectively identify early indicators of predatory behavior and escalate them to our team of dedicated moderators (a simplified sketch of how such an escalation pipeline might work appears after this list). Once a conversation has been reviewed, we will take strict action against accounts that have engaged in conversations indicative of grooming behaviors and immediately make a report to authorities in adherence with local laws, rules, and regulations.
- Continue prevention of Child Sexual Abuse Material (CSAM) through automated and human-based moderation
One of the most harmful interactions during online grooming is the exchange of CSAM. [2] In order to prevent such acts from happening on our platform, we have automated and human-based moderation in place to detect and remove images and videos of child sexual exploitation.
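To make the escalation flow referenced above more concrete, the sketch below shows one way a grooming classifier could be wired into a chat pipeline: each message receives a risk score, a running score is kept per conversation, and conversations that cross a threshold are queued for human review. This is an illustrative Python sketch only; the function names, threshold, and keyword-based scoring are placeholders for a real trained model and are not Thorn's classifier API or ZEPETO's actual implementation.

```python
"""Hypothetical sketch: routing chat messages through a grooming risk score
and escalating high-risk conversations to a human moderation queue.
All names here are illustrative, not a real API."""
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class ModerationQueue:
    """Stand-in for the internal queue reviewed by human moderators."""
    escalated: list = field(default_factory=list)

    def escalate(self, conversation_id: str, reason: str) -> None:
        self.escalated.append((conversation_id, reason))


def score_message(text: str) -> float:
    """Placeholder scorer returning a grooming risk score in [0, 1].
    A real deployment would call the trained model ensemble here."""
    risky_cues = ("keep this secret", "don't tell your parents", "send me a photo")
    return 1.0 if any(cue in text.lower() for cue in risky_cues) else 0.1


RISK_THRESHOLD = 0.8                     # score above which a conversation is flagged
conversation_risk = defaultdict(float)   # running maximum risk per conversation
queue = ModerationQueue()


def handle_message(conversation_id: str, text: str) -> None:
    """Score each incoming message and escalate the conversation once its
    running risk crosses the threshold, surfacing early indicators to moderators."""
    score = score_message(text)
    conversation_risk[conversation_id] = max(conversation_risk[conversation_id], score)
    if conversation_risk[conversation_id] >= RISK_THRESHOLD:
        queue.escalate(conversation_id, f"grooming risk score {score:.2f}")


# Example usage
handle_message("conv-42", "You're really mature for your age. Keep this secret, ok?")
print(queue.escalated)   # [('conv-42', 'grooming risk score 1.00')]
```

In practice, scoring whole conversations rather than single messages matters because grooming often emerges gradually across many individually innocuous exchanges, which is why the sketch keeps a running per-conversation score before escalating to human review.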
Though we are continuously working to improve our automated detection methods, these tools have limitations. Grooming is an intricate behavior, and technology alone cannot keep predators from acting on our platform.
Our community will need to work together to fight online grooming and create a safer and more enjoyable experience for our users and the wider digital community. Read our guide on fighting online grooming as a community to learn which red-flag grooming behaviors to look out for, what you can do to protect yourself or your children, and how to seek help.
ZEPETO does not tolerate child grooming on our platform. Our Trust and Safety team is committed to the safety of minors and is working to ensure that our platform is a safe environment for our younger users and our entire community.
[1] https://info.thorn.org/hubfs/Research/2022_Online_Grooming_Report.pdf
[2] https://www.inhope.org/EN/articles/the-impact-of-online-grooming
By Sophia Yoo – Senior Data Analyst, Trust & Safety