
Providing easy and accessible ways to report abuse on ZEPETO is crucial to safeguarding our users’ safety and experience. Earlier this month, we unveiled our new reporting interface in the ZEPETO app, redesigned to make the reporting process more straightforward, comprehensive, consistent, and accessible across all of our features.
Easier and more comprehensive reporting and blocking

Presenting “clear escalation pathways and reporting on all user safety concerns” is a key Safety by Design principle. We’ve incorporated this by doing the following:
- Consolidated in-app reporting menus across features, including Feed, Profile, World, DM, and Crew.
- Expanded reporting options to cover all potential Community Guideline violations to empower users to report all safety concerns.
- Listed the highest harm-potential violations at the top of the reporting menu for quicker access (e.g., violations involving child safety or suicide and self-harm).
- Used plain, succinct language to cater to younger users.
- Provided links to relevant Help Center pages for more information on reporting cybercrimes, copyright violations, and accessing crisis resources.
- Equipped users with additional touchpoints within the reporting process to block the person they are reporting.
Increasing awareness and improving users’ understanding of ZEPETO’s Community Guidelines

Guidelines are only as effective as they are comprehensible. We believe that ensuring our community clearly understands what is and is not allowed on our platform is our responsibility.
Our new reporting categories now align with our updated Community Guidelines. When users select a specific reason for reporting within a category, we also list examples of the types of content we remove based on that particular Community Guideline policy.
Enforcing actions with clarity and transparency
Our newly updated reporting system aims to provide clarity to both the user who reports and the user who is reported. Research suggests that when moderated users are given explanations for why their content was actioned, their future posts are less likely to be removed.
When we take action on a report, we now send information on what specific policy was violated in our Community Guidelines to the reported user.
The road ahead
Revamping user reporting is just the beginning of our broader initiative toward increased consistency in moderation and greater transparency as a platform. This update enables us to measure and track harmful content more accurately, monitor our progress, and share meaningful assessments of reported abuse to uphold the transparency and accountability of ZEPETO. In the coming months, we look forward to sharing more updates on the improved tools and processes we build toward a safer metaverse.
By Elyse Lee, PMP, Head of Trust and Safety Policy & Partnership, and Jackie Lho, Manager, Trust and Safety Policy & Partnership