2022-03-15 08:05
Press release

Towards next generation chat moderation in Star Stable Online

Toxicity online is part of a growing public discourse around the impact of technology on our lived experience. Many look to the games industry for answers. As global player communities connected to major franchises have grown, so has the incidence of toxicity, and tackling bullying, harassment, and predatory behaviors became a priority in our sector more than a decade ago.

By outlining unacceptable actions and consequences for transgressors, leading studios responded with a rulebook that set the standard for in-game moderation. With soft preventative measures and hard corrective measures, this rulebook discourages players from doing potentially harmful things by threatening censorship, punishment, and exclusion. With that, it addresses the most immediate threat of toxicity.

However, it fails to answer the growing need for game platforms to nurture friendly and inclusive communities by nurturing more friendly and inclusive players. For young people, interactions and relationships fostered in digital spaces - particularly game platforms - are formative. As online communities become embedded sites of personal and interpersonal growth, the industry must move beyond policing and punishment and take responsibility for cultivating civilized online cultures.

It’s time for next generation chat moderation, and Star Stable Entertainment is leading the way. In an innovative pilot project, we successfully reduced the level of toxicity in our flagship game, Star Stable Online (SSO), through positive behavioral nudging.

In this pilot summary, we’ll first describe our rationale for the project and its twin mechanics: advanced artificial intelligence (AI) and social and emotional learning (SEL) principles. Next, we’ll look at the pilot and its findings in detail. Finally, we’ll explain why the results should inform the future of chat moderation in games.

Pilot background

Solving moderation in Star Stable Online

Star Stable Online is a massively multiplayer online role-playing game (MMORPG) targeted at girls aged eight to 18. Launched in 2011, the game has grown year on year, with 20 million registrations across 180 territories and 14 languages. In its fantasy world, players complete magical adventure quests on horseback and socialize on open chats and member-only Riding Club chats. Riding Clubs are exclusive groups established by and for players with shared niche interests, like equestrian disciplines and genre literature. As the only places where players communicate directly in the game, these chats are flashpoints for toxic behavior.

Of the thousands of texts published in the chats every day, four percent are problematic. This includes spam and links, but primarily it accounts for words and phrases we prohibit. That figure is exceptional within the wider games industry and incomparably low next to the levels of toxicity observed on social media platforms. However, a paradigm shift in chat moderation makes it our high-water mark, as Director of Customer Experience Cecilia Munthe explains:

“There are three parts to what I call the moderation puzzle. There’s the policing element that ensures things that absolutely should not be part of the chat are not part of the chat. There’s the self-moderation piece, where players alert us to problems. And then there’s a third part about nurturing positive behaviors. This pilot was about finding that final piece.”

New ways of working

Having used AI-powered solutions in front-line moderation, we saw the potential for machine learning to develop the final piece of the moderation puzzle: positive behavioral nudging. The initiative to expand its use in SSO came from Executive and Technical Producer Jane Skullman, inspired by her professional interest in the subject and a long-running dialogue with AI specialists Oterlu. “The team was asking, do we want to be the police or pals,” Jane recalls. “But I had a vision that with Oterlu’s help, we could instead create a community that is more kind.”

Oterlu was a natural fit for an initiative seeking new approaches to online safety. “We’re interested in how tech and artificial intelligence, in particular, can do more than just prevent bad stuff from happening on digital platforms,” Founder and CEO Alexander Gee explains.

In collaboration, the two teams devised the pilot. Its first innovation would be to apply Oterlu’s cutting-edge techniques to develop a bespoke AI model with a native grasp of the language used by our players. This fluency would give us a moderation filter with unprecedented nuance, cutting down on texts censored in error and helping us detect opportunities for behavioral interventions.

The intervention itself would be the pilot’s second innovation: a message designed to nudge players in violation of our guidance toward positive action. The text drew on pioneering social and emotional learning principles rarely applied in the games industry. Briefly, social and emotional learning pivots around five pillars: self-management, self-awareness, social awareness, relationship skills, and responsible decision-making. Each pillar encompasses a variety of personal and interpersonal skills.

Here we relied on social and emotional learning experts Peppy Agency. Co-founder and CEO Paulina Olsson shares our conviction that moderation doesn’t have to rely on punishments like censored messages and bans. That route, she warns, can leave children unaware of their hurtful behavior and perhaps feeling worse than before. A proactive approach, by contrast, can put us in “non-judging conversations with players about their behavior.”

Pilot details

Moderation beyond binaries

Typically, studios nominate words and phrases for exclusion and apply an automated filter to detect and block them. Generally, they exclude expletives, words depicting violent acts, and threats of physical harm. In this pilot, an advanced AI model learned the language of SSO chats to a level of proficiency that meant it didn’t merely detect and block forbidden messages; it extrapolated and inferred meaning. Alexander explains:

“AI grows naturally, extrapolating and inferring meaning from a combination of the language we tell it to ban and what it picks up in the community. This technology can identify messages designed to manipulate the filter and tell when the usage of a word is problematic and when it isn't. This nuance is missing from your typical chat filter.”

This nuance moved our moderation solutions beyond the binary of acceptable and unacceptable. Of course, machine learning requires training to accurately classify data. As Jane cautions,

“AI is not a silver bullet. It has to learn context. As an example, the word snowflake in one usage is innocent (e.g., A snowflake is falling from the sky). It can also be insulting (e.g., You’re a freaking snowflake). The AI can understand context, but not right away.”
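
To make the contrast concrete, here is a minimal sketch in Python of the difference between a plain keyword filter and a context-aware classifier, using the open-source Hugging Face transformers library. The model name is a hypothetical stand-in; Oterlu’s production model is bespoke and not public.

    # A plain keyword filter flags both "snowflake" sentences equally.
    BLOCKLIST = {"snowflake"}

    def keyword_filter(message: str) -> bool:
        return any(word in BLOCKLIST for word in message.lower().split())

    # A fine-tuned transformer classifier can score the two usages differently.
    from transformers import pipeline

    classifier = pipeline(
        "text-classification",
        model="example-org/chat-toxicity",  # hypothetical model name
    )

    for text in [
        "A snowflake is falling from the sky",  # innocent usage
        "You're a freaking snowflake",          # insulting usage
    ]:
        result = classifier(text)[0]
        print(f"{text!r} -> {result['label']} ({result['score']:.2f})")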

To minimize mistakes, Oterlu completed a series of offline analyses before the technology was implemented. Random samples were taken from the chat and the messages flagged by the model to check what, if anything, was slipping through or rejected in error.
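
The sketch below illustrates that kind of offline audit: draw a random sample of messages, compare the model’s flags with human judgments, and count what slipped through (false negatives) and what was rejected in error (false positives). The data layout and sample size are assumptions for illustration; Oterlu’s actual evaluation procedure isn’t documented here.

    import random

    def audit_sample(messages, sample_size=500):
        """messages: list of (text, model_flagged, human_judged_toxic) tuples."""
        sample = random.sample(messages, min(sample_size, len(messages)))
        rejected_in_error = [m for m in sample if m[1] and not m[2]]
        slipped_through = [m for m in sample if m[2] and not m[1]]
        print(f"Rejected in error (false positives): {len(rejected_in_error)}")
        print(f"Slipped through (false negatives):   {len(slipped_through)}")
        return rejected_in_error, slipped_through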

While the model was being customized, Peppy Agency was engineering the intervention in collaboration with the then Director of Community, Content and Channels, Jane Billet, and her team. To formulate a non-judging message encouraging a change in behavior, they drew on behavioral psychology and research on how moderators can talk about online bullying without identifying behaviors as wrong or bad. Like the AI model, this sensitive approach saw us push past simplistic binaries.

Ultimately, flagged users were served with this message:

"Hey! It seems you've sent an unkind message. Stop and think: How would you feel if someone said that to you? Remember, being a good friend can also get you lots of new friends!"

When dissected, it’s possible to see each of the three steps Peppy Agency developed for effective behavior change online (sketched in code after the list):

  1. Help the player identify the behavior SSO wants to change (e.g., “Hey! It seems you've sent an unkind message.”)
  2. Encourage empathy by letting the player reflect on how that behavior would feel (e.g., “Stop and think: How would you feel if someone said that to you?”)
  3. Offer support and examples of more positive behavior (e.g., “Remember, being a good friend can also get you lots of new friends!”)
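
Expressed in code, the three steps might look like this sketch. The wording is the pilot message quoted above; how it was actually assembled and delivered in-game is not public, so the function is purely illustrative.

    # The three steps of the nudge, in order, per Peppy Agency's structure.
    NUDGE_STEPS = (
        "Hey! It seems you've sent an unkind message.",  # 1. identify the behavior
        "Stop and think: How would you feel if someone said that to you?",  # 2. encourage empathy
        "Remember, being a good friend can also get you lots of new friends!",  # 3. offer a positive path
    )

    def build_nudge() -> str:
        """Join the three steps into the single message shown to flagged players."""
        return " ".join(NUDGE_STEPS)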

With additional input from stakeholders in the Community Engagement team and our backend teams, the pilot went live on UK servers.

Results too good to ignore

To make the results statistically sound, the project was set up as an A/B test. Half the users flagged for writing problem texts were served the carefully calibrated message. This was the treatment group. The other half, the control group, received no intervention, though their chat messages were detected, recorded, and censored. Oterlu measured the toxicity in both cohorts in real time over four weeks.
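
A deterministic 50/50 split, like the sketch below, is one common way to implement this kind of cohort assignment. The actual mechanism isn’t described in the pilot, so treat the function and its names as assumptions.

    import hashlib

    def assign_cohort(user_id: str) -> str:
        """Hash the user ID so each flagged user lands in a stable cohort."""
        digest = hashlib.sha256(user_id.encode("utf-8")).digest()
        return "treatment" if digest[0] % 2 == 0 else "control"

    # Treatment users receive the nudge; control users are censored silently.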

The impact was obvious almost from the outset. More users in the control group were sending more toxic messages more of the time. By the end of the pilot, toxicity in the treatment group was reduced by five percent. Had the project continued, Oterlu predicts the rate “would only have continued to drop.”

While five percent may seem modest, scale is important. When you’re talking about thousands of messages, a reduction of five percent could bring serious benefits to the welfare of a community. Based on historical data, this result would translate to a reduction of over 5000 toxic messages annually in the SSO chat, for example. When you consider the scale of toxicity on other game platforms, five percent is too significant to ignore.
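
As a back-of-envelope check on that scale claim, the arithmetic below uses illustrative volumes consistent with the figures cited earlier (thousands of texts a day, four percent problematic); the daily message count is an assumption, not a published SSO number.

    daily_messages = 7_000  # assumption: "thousands of texts ... every day"
    toxic_share = 0.04      # four percent are problematic
    reduction = 0.05        # five percent drop observed in the pilot

    toxic_per_year = daily_messages * toxic_share * 365  # ~102,200
    avoided = toxic_per_year * reduction                 # ~5,110
    print(f"Roughly {avoided:,.0f} fewer toxic messages per year")

With those assumptions, the effect lands just above the “over 5000” figure cited above.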

For Oterlu, the implications for the games industry are clear. “It says there is a better way,” Alexander concludes. “Messaging can be an effective way to reduce toxicity and abusive behavior. With the help of technology, this approach is a powerful way to make your community safer and more welcoming.”

“Look at the big picture,” Jane offers optimistically. “Imagine what might happen if more social communities use nudging rather than policing to achieve results like this. Where could this technology take us?”

Pilot assessment

Time to rewrite the rulebook

To understand why in-game moderation must evolve, imagine a kids’ online game platform as a real-world playground. Looked at this way, common safety measures look “pretty dystopian,” according to Alexander. He adds: “Shutting someone out of your platform is like dragging a child away and putting them in a room all alone. Banning them and censoring them is like putting them in a muzzle.”

This stark image doesn’t exaggerate the position game platforms have in the lives of young players. Digital spaces like SSO are playgrounds where children make friends and develop meaningful relationships. “We work on the premise that this is a playground,” Paulina offers.

“The studio makes it, but it’s owned by the children because they have created that community through the friendships they found there.”

As the site of interpersonal connection and the emotional investment it inspires, game platforms play a significant role in players' lives. We know this to be true of the SSO community. For ten years, players have relied on the game as a space to connect with others, often more successfully than they do in their real lives. The chat has a critical role in that. There, players seek exchanges about issues like mental health, identity, family, school, and life’s big questions. This was one of Jane Skullman’s motivations for progressing her vision for the chat:

“Policing measures created a lot of frustration within our player community because they didn’t let them talk about the things they wanted to talk about. We’ll comply with all regulations, but beyond that, who are we to stop players from talking about what matters to them? And just like in the schoolyard, they will talk about those topics. Our job is to protect them, not cut them off from important conversations with friends.”

The risk of outdated moderation solutions is clear, but so is the suggestion that game platforms are uniquely well-placed to support social and emotional learning for players in their formative years. “We’re using their playground and supporting them in a world they’re already in,” Paulina explains, “not forcing them to come into settings that have the feeling of being in the control of adults.” She continues:

“In a digital space like Star Stable Online, you have more opportunities to use play and storytelling for social and emotional learning moments that are on the children’s terms, which makes the child feel part of something joyful and relevant instead of educational. A classroom can be made to feel joyful, too, but when you’re in a game world, everything is about fun and friendship. Since that’s your starting point, the learning process will automatically be more engaging and relevant for the child.”

The facility to deliver personalized and responsive technology gives games another advantage in promoting positive behavior. “Social and emotional skill-building in a game world can happen at the exact moment these feelings are expressed in hurtful behaviors,” Paulina advises. “In the school environment, for example, a child might be alone with a negative experience because the behavior didn’t flag our attention. In the digital space, it's detected and addressed as it occurs - when it matters most.”

In a film created to mark our tenth anniversary, we claim SSO is more than a game. In part, this is what we meant - it’s where girls can find themselves through enriching relationships that shape their self-image and their view of the world.

This isn’t about gender

SSO players characterize our community as welcoming, creative, and passionate, a reflection of the low level of toxicity observed in our chats. But the character of our community isn’t an accident; it’s by design. Gender stereotypes make it tempting to believe that a game for girls would naturally harbor low levels of toxicity, a mistake that risks minimizing the result of the pilot and what it represents for other studios. “Our community is very much co-created,” Cecilia highlights. “But the myth that girls are always nice hounds us. For teenage girls, emotions run high, so rules are broken. Moderation is very much a necessity.”

The stereotype that our audience should be nice presents another risk. As the creator of products for girls and young women, our desire to nurture kindness and inclusivity might align us with gender normative expectations. Instead, the pilot is part of broader efforts to facilitate personal and interpersonal development. “That’s not the same as always being nice,” Cecilia clarifies. “People being nice to each other all the time isn’t real life, and we wouldn’t be serving our players if that’s what we tried to do. That’s why we must move away from simplistic moderation. We need to tackle negative intent.”

Interest in next generation chat moderation is widespread in the industry, but our market position and dominant player demographic are unique, so it’s important to stress the risks of a gendered reading. Further, the pilot answers compelling calls for tech companies to do more to foster friendly online communities, not least from players themselves. To ignore the implications of the pilot is to ignore this call.

Considerations moving forward

SSO moved the needle on reducing toxicity in games with an innovative pilot. The results have far-reaching implications for the games industry. But what comes next for us?

Developing the pilot invites new challenges. When it was devised, Cecilia recalls,

“We were aligned that this project was being undertaken in the interest of player safety and the culture we’re building in Star Stable Online.”

Expanding the use of AI and interventions would mean revisiting earlier questions, including concerns around the complexities of machine learning, the ways data would be used, and how to prioritize this initiative in terms of business objectives.

We look forward to further exploring next generation moderation and advocating for the uptake of behavioral interventions elsewhere in the industry toward a more positive gaming community globally.


About Star Stable

Star Stable Entertainment is a cross-channel entertainment company that makes games and stories where girls discover adventure, ignite creativity and build friendships.