Moderating the Metaverse: A Conversation with Ravi Yekkanti

Category: Technology

tldr #

Ravi Yekkanti works in content moderation for virtual reality experiences in the metaverse. He spoke to me about how moderating the metaverse is different from reviewing texts and videos, and about situations where he is taunted or bullied based on his ethnicity. Moderators need to stay undercover, prepare for different scenarios, track everything occurring in the metaverse, and report bad behavior to the client or to law enforcement.


content #

I chatted with Ravi Yekkanti, who works for WebPurify, a third-party content moderation company that provides services to metaverse companies. Ravi moderates these environments and trains others to do the same. He told me he runs into bad behavior every day, but he loves his job and takes pride in how important it is. We get into how his job works in my story this week, but our conversation contained far more fascinating detail than I could fit into that format, and I wanted to share the rest of it with you here.

WebPurify was founded in 2009.

Here’s what Ravi had to say, in his own words: I started working in this field in 2014. By now I’ve looked at more than a billion pieces of content, including texts, images, and videos. Since day one, I have always loved what I did. That may sound strange coming from someone who works in moderation, but I started in the field by reviewing movies, books, and music, so it was like an extension of my hobbies.

Ravi has moderated for over eight years and has looked at over a billion pieces of content.

The major difference is the experience. VR moderation feels so real. I have reviewed a lot of content, but this is definitely different because you are actually moderating behavior. And you are also part of it, so what you do and who you are can trigger bad behavior in another player. I’m Indian with an accent, and that can trigger bullying from other players. They might come up to me, say something nasty, and try to taunt or bully me based on my ethnicity.

Customer service and moderating the metaverse go hand in hand to ensure the safety of those in the game.

We do not reveal, of course, that we are moderators. We have to maintain our cover, because knowing a moderator is present might make players cautious.

Yeah, it definitely feels different. When I put on a VR headset for the very first time in my life, I was awestruck. I had no words to explain the experience. It felt so good. When I started doing moderation in VR and trying out games with other players, it was a little intimidating. It could be because of the language difference, or because you are conscious that you’re meeting people from all over the world whom you’ve never met. There is also no such thing as personal space.

Global moderators are needed to ensure safety in different languages.

First, we prepare technically. We go over our policy: we stay undercover and act as hosts in the game. We are expected to start conversations, ask other players if they are having a good time, and teach them how to play the game.

The second aspect of preparation is related to mental health. Not all players behave the way you want them to behave. Sometimes people come just to be nasty. We prepare by going over different kinds of scenarios that you can come across and how to best handle them.

Objects in the metaverse can be owned by players.

We also track everything. We track what game we are playing, which players joined the game, what time we started, and what time we ended. What was the conversation about during the game? Is a player using bad language? Is a player being abusive?
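Ravi doesn’t describe WebPurify’s actual tooling, so read this only as a sketch: the fields he lists map naturally onto a simple tracking record, something like the following (all names here are hypothetical):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class SessionLog:
    """One game session's tracking record (hypothetical schema, not WebPurify's)."""
    game: str                                         # what game we are playing
    players: list[str] = field(default_factory=list)  # what players joined the game
    started_at: Optional[datetime] = None             # what time we started
    ended_at: Optional[datetime] = None               # what time we ended
    conversation_notes: str = ""                      # what the conversation was about
    bad_language: bool = False                        # is a player using bad language?
    abusive: bool = False                             # is a player being abusive?
```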

Sometimes we find behavior that is borderline, like someone using a bad word out of frustration. We still track it, because there might be children on the platform. And sometimes the behavior exceeds a certain limit, like if it is becoming too personal, and we have more options for that.
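The interview doesn’t spell out the escalation rules, but the two tiers Ravi describes, tracked-but-tolerated versus over-the-line, could be sketched like this (the logic is hypothetical):

```python
from enum import Enum, auto

class Severity(Enum):
    BORDERLINE = auto()  # e.g. a bad word used out of frustration
    SEVERE = auto()      # e.g. abuse that becomes too personal

def handle(severity: Severity) -> str:
    # Borderline behavior is still tracked, because children may be on the platform.
    if severity is Severity.BORDERLINE:
        return "track only"
    # Severe behavior opens up more options (the interview doesn't list them).
    return "track and escalate"
```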

Metaverses are becoming more popular and are projected to be worth $25 billion by 2025.

Well, we create a weekly report based on our tracking and submit it to the client. Depending on how often a player’s bad behavior recurs, the client might decide to take action.
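Since the report format isn’t described either, here is a hedged guess at its core: tallying tracked incidents per player, because repetition is what the client keys its decisions on (the threshold below is made up):

```python
from collections import Counter

def weekly_report(incidents: list[dict]) -> Counter:
    """Tally tracked incidents per player over the week (illustrative only)."""
    return Counter(incident["player"] for incident in incidents)

logs = [{"player": "playerA"}, {"player": "playerB"}, {"player": "playerA"}]
# A hypothetical repetition threshold; the real criteria are the client's call.
repeat_offenders = [p for p, n in weekly_report(logs).items() if n >= 2]
print(repeat_offenders)  # ['playerA']
```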

And if the behavior is really extreme and violates local laws, then we come into action: we report it to local law enforcement and proceed accordingly.

