ZEPETO Safety Center

“Mommy, what is the metaverse?”

This was the first question my six-year-old son asked me, right after I joined NAVER Z as the lead for Trust and Safety Policy and Partnership. 

Today, our updated Community Guidelines and Studio Content Guidelines go into full effect.
In these updated Guidelines, we have outlined the values we stand for and our policy rationale for the content and conduct we prohibit. In order to better protect our community, it is our responsibility to clearly communicate what ZEPETO is and isn’t.

The key questions we aimed to address as we drafted our updated guidelines were:

1) What is the metaverse?

The concept of the metaverse is still forming, without a shared, globally agreed-upon definition. To begin updating our Guidelines, we first needed to ask ourselves the same question my son asked me on my first day: what is the metaverse?

We do not know yet what the metaverse will come to be. But we are focused on building a virtual space where anyone can collaborate to create their dream community and content.
We are building a place easily accessible to everyone where meaningful connections, creativity, diversity, and free expression are celebrated.

Ultimately, we want our community to find joy on our platform and have fun. But none of this can come before or at the expense of other users’ safety.

2) How might digital safety risks be exacerbated in the metaverse?
The metaverse is immersive. The risk of unwanted conduct is heightened because unwanted individuals can enter someone's virtual space. This means both content and conduct need to be monitored and regulated. With this in mind, we updated our Guidelines to apply both to inappropriate or harmful user content and to abusive or harmful conduct in the form of inappropriate avatar behaviors.

3) How can we protect the most vulnerable?

We care deeply about the safety and well-being of all members of our community. We are also mindful that certain groups, such as minors, women, people of color, and LGBTQ communities, are particularly vulnerable to becoming the target of online harm, including sexual exploitation, hateful activities, harassment, and bullying.

Since the beginning of this year, we've doubled down on our commitment to protect our younger users by partnering with minor safety experts and publishing our Guardian's Guide.
Now, in our updated Community Guidelines, we highlight minor safety as our key priority. We will also consider off-platform behavior when reviewing accounts for Community Guidelines violations if we become aware, through credible evidence, of harmful behavior toward minors off-platform.

We continue to strictly prohibit hateful activities based on protected characteristics, bullying, and harassment on our platform.

4) How do we protect freedom of expression while preventing harmful and dangerous content and conduct? 

While we take a firm stance on the harmful content and behaviors that impede our community’s safety, our updated Guidelines also emphasize the importance of viewpoint diversity based on civility, respect, and inclusivity.

Resolving tensions between freedom of expression and safety is not an easy task. But we can start by looking at international human rights law and its guiding principles. 

International human rights law provides the principles of legality, legitimacy, and necessity for limiting expression, helping us pursue the twin goals of moderating toxic content while protecting freedom of expression. Although we may not always get things right, these principles will guide our policy development and moderation decisions as we continue to learn and adapt.

5) What’s next?

Having clear, accessible Community Guidelines is critical in our effort to empower our users. When users clearly understand our guidelines, they are also more likely to abide by them. Research suggests that when moderated users are given explanations for actioned content, their subsequent post-removal rate decreases.

But these updates are only the beginning. Our Guidelines will continue to be shaped by evolving risks, our users’ feedback and experience, and other internal and external input to best serve the ZEPETO community. 

As our platform grows, providing transparency to our community about our decisions and rulemaking will become ever more crucial. We support the voices that call for the industry to involve those impacted by tech in developing tech. We are committed to building a safer metaverse by embodying our key values of safety, inclusion, diversity, dignity, and creativity.

Thank you,

Elyse Lee, PMP, Head of Trust and Safety Policy and Partnership
