
  Improving Online Gaming  

OVERVIEW

Online gaming is full of people, and people aren’t always very nice. In the heat of an exciting game, players sometimes erupt into verbal abuse, racism, sexism, and even death threats. The toxicity of the online environment has been implicated in depression, anxiety, physical violence, and suicide.

 

Duration: 6 weeks

Team: Carl Klutzke, Phani Teja Nallam, Evi Odioko

Pain points:

1. TOXICITY:  How can toxic behavior be mitigated?

2. LEARNING:  How can we help people learn new games?

3. MAKING FRIENDS:   How can we help gamers find other good people to play with?

Solution:

Our design solution is a reputation system in which every gamer has a common ID used across all gaming platforms. Any action taken against a player affects their rating, and the size of the penalty depends on the severity of the offense. Other players can then use the visible rating to choose who they would like to play with. Because gamers will avoid playing with people whose ratings show a history of foul or toxic behavior, the system gives toxic players an incentive to change: to redeem their standing and avoid further reports, they have to behave well during gameplay.
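To make the idea concrete, here is a minimal sketch (in TypeScript) of how severity-weighted reports could drive the shared score, and how good behavior could restore it. The type names, penalty weights, and recovery rate are illustrative assumptions for this case study, not part of any existing platform.

```typescript
// Minimal sketch of the proposed reputation system.
// Type names, penalty weights, and the recovery rate are illustrative assumptions.

type Severity = "trash_talk" | "harassment" | "hate_speech" | "threat";

// Heavier offenses cost more reputation (assumed weights).
const SEVERITY_PENALTY: Record<Severity, number> = {
  trash_talk: 1,
  harassment: 3,
  hate_speech: 6,
  threat: 10,
};

interface Report {
  reporterId: string;
  offenderId: string;
  severity: Severity;
  timestamp: number;
}

interface ReputationProfile {
  gamerId: string; // the common ID shared across gaming platforms
  score: number;   // e.g. starts at 100 and is visible to other players
}

// A report lowers the offender's shared score in proportion to its severity.
function applyReport(profile: ReputationProfile, report: Report): ReputationProfile {
  const penalty = SEVERITY_PENALTY[report.severity];
  return { ...profile, score: Math.max(0, profile.score - penalty) };
}

// Good behavior slowly restores reputation, so players can redeem their standing.
function rewardCleanSession(profile: ReputationProfile): ReputationProfile {
  return { ...profile, score: Math.min(100, profile.score + 0.5) };
}
```

Because the score travels with the common gamer ID, a penalty earned in one game would follow the player into every other game that reads from the same system.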

THE PROBLEM

How pervasive are toxic behaviors, and what can be done about them?

(in the context of online multiplayer games)

THE PROCESS
1. EMPATHIZE 

Qualitative Research

Interviews

Field Research

Observations

2. DEFINE

Affinity Diagram

User Journey

Persona

3. IDEATE

Concept Sketching

Wireframing

4. PROTOTYPE

Low Fidelity

High Fidelity

5. TEST

Cognitive Walkthrough

Heuristic Analysis

Think Aloud

DATA COLLECTION

Observations & Interviews

Some of our observations were conducted online: we watched recorded play sessions and examined existing resources and research. The rest were conducted in person at the gaming center and at a public eSports facility.

  • If you don’t have local friends to play with, starting a new game can be like entering a harsh new world all alone.

 

  • There is no bright line between toxic behavior and acceptable behavior. “Trash talk” is fun for some players but not for others. 

  • New players (“noobs”) attract abuse because they slow down veteran gamers.

Observation callouts: NEW PLAYERS = SHIT · HELP = LOCAL FRIENDS · TRASH TALK IS THE NEW BLACK

PROBLEM FRAMING

Affinity Mapping

Concept Mapping


The concept map felt somewhat similar to affinity diagramming, because both involved identifying concepts in the domain of study, but the two methods provided different benefits. The affinity diagram helped us identify the important ideas and problems in the domain. The concept map helped us see all the moving parts in the domain and how they relate to each other: information we will need in order to propose any kind of solution.

PERSONA 

The Victim 

Character design by  Anastacia Loginova

The Bully

Character design by  Anastacia Loginova

BRAINSTORMING & STORYBOARDING

Argus tries a fake ID

Matilda reports a player

DESIGN SOLUTION

After brainstorming solutions and drawing storyboards for several scenarios, we combined two of the scenarios into the idea of a central gamer reputation system, driven largely by the following improvements to toxicity reporting (sketched in code after the list):

  • In-game reporting is as fast and easy as muting.

  • The system monitors gameplay and communication channels for events that may indicate toxic activity, and automatically includes a log of these events to objectively substantiate reporting.

  • Reports appear in the central gamer ID hub system and affect player reputation immediately.

  • Reported players are immediately muted and blocked from your account so no game will match you with them in the future.
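The reporting flow described above can be summarized in a rough sketch; every interface and function name here is an assumption made for illustration, not a real game or platform API.

```typescript
// Sketch of the in-game reporting flow: one action files the report,
// attaches the monitoring log as evidence, and mutes + blocks the offender.
// All names are illustrative assumptions.

interface GameEvent {
  channel: "voice" | "text" | "gameplay";
  description: string; // e.g. an event flagged by the toxicity monitor
  timestamp: number;
}

interface ToxicityReport {
  reporterId: string;
  offenderId: string;
  evidence: GameEvent[]; // automatically attached log of monitored events
}

interface CentralHub {
  submitReport(report: ToxicityReport): void; // affects reputation immediately
}

interface GameSession {
  recentEvents(playerId: string): GameEvent[]; // events flagged by the monitor
  mute(playerId: string): void;
  block(playerId: string): void; // never match the reporter with this player again
}

// One tap: report, attach evidence, and immediately mute and block the offender.
function reportPlayer(
  hub: CentralHub,
  session: GameSession,
  reporterId: string,
  offenderId: string,
): void {
  hub.submitReport({
    reporterId,
    offenderId,
    evidence: session.recentEvents(offenderId),
  });
  session.mute(offenderId);
  session.block(offenderId);
}
```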

System diagram: Game Sites 1–4 and the game itself connect to the CENTRAL HUB, which provides access, data, and monitoring.

USER JOURNEY

In-game Process

PROTOTYPING

Accepting a request

A central hub for gamers that links identity verification to every gamer ID, lets gamers rate each other based on their behavior, and enables age-restricted gaming.
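As a rough illustration, a hub profile might hold something like the following; the field names and the age check are assumptions for this sketch, not the prototype's actual data model.

```typescript
// Hypothetical Central Hub profile record (field names are assumptions).
interface HubProfile {
  gamerId: string;           // the shared ID used across gaming platforms
  identityVerified: boolean; // set once a government ID has been checked
  dateOfBirth: string;       // ISO date; enables age-restricted gaming
  behaviorRating: number;    // driven by reports and peer ratings
  linkedAccounts: string[];  // per-platform accounts linked to this gamer ID
}

// A game would only admit a player to an age-restricted lobby if the hub confirms it.
function canJoinAgeRestrictedGame(profile: HubProfile, minimumAge: number): boolean {
  const msPerYear = 365.25 * 24 * 60 * 60 * 1000;
  const age = (Date.now() - new Date(profile.dateOfBirth).getTime()) / msPerYear;
  return profile.identityVerified && age >= minimumAge;
}
```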

In-game Reporting

In-game reporting includes muting other players, blacklisting toxic players, quick reporting and verification, and an automated transcript monitor for voice communications that shuts down a player’s microphone after too many violations or complaints.
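A minimal sketch of the transcript monitor’s strike logic, assuming a simple three-strike threshold (the threshold and names are illustrative):

```typescript
// Hypothetical voice-monitor state: too many strikes shuts down the mic.
const MAX_STRIKES = 3; // assumed number of violations/complaints allowed

interface VoiceMonitorState {
  playerId: string;
  strikes: number;
  micEnabled: boolean;
}

// Called whenever the transcript monitor flags a violation or a complaint arrives.
function recordStrike(state: VoiceMonitorState): VoiceMonitorState {
  const strikes = state.strikes + 1;
  return {
    ...state,
    strikes,
    micEnabled: strikes < MAX_STRIKES, // once the limit is reached, the mic is cut
  };
}
```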

Check Profile and Reputation

If someone reports a gamer for being toxic, the report affects that gamer’s score in the rating system, much as Uber passengers rate their drivers.

EVALUATION & NEXT STEPS

Cognitive Walkthrough → Heuristic Analysis → Think Aloud Exercise

PROBLEMS: Our evaluations revealed a few problems with the prototype, of which the following were most significant:

 

  • Realistic Placeholders: We duplicated some placeholder images for quick construction, which didn’t look realistic (“Why am I friends with Andrew five times?”) and distracted the testers. We would use more realistic placeholders in the future.

  • Similar Profiles: The two player profile pages looked very similar, and testers were sometimes confused about whether they were looking at their own page. Our intro script should have identified the tester as “John Smith”, and we should have provided distinct images for the two players.

  • Realistic Names: We used the ID “Troll_247” for our troublesome player, but this unrealistic name also distracted our testers and is not the kind of name that would appear in our central ID hub.

  • Terminology: We used the terms “blocked players” and “blacklist” interchangeably instead of using one term consistently.

SUGGESTIONS: Our evaluators suggested some additional features:

 

  • Recent Players: The Central ID Hub should provide a list of people you played with recently, so you could find them and rate them whenever you wish. This would be an important driver of the reputation system, in addition to the toxicity reporting.

  • Different Rating Types: The reputation system should separate ratings of the player’s skill from ratings of their amiability/personality. Some users won’t care how crass a player is if he helps their team win. Other users won’t care how skilled a player is if he’s unpleasant to play with.
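A small sketch of how these two suggestions could be modeled, with separate skill and amiability ratings and a recent-players list; all names and the 1–5 scale are assumptions:

```typescript
// Hypothetical reputation record separating skill from amiability,
// plus a recent-players list so ratings can be given after the session.
interface PlayerRatings {
  skill: number[];      // e.g. 1–5 stars for how well they play
  amiability: number[]; // e.g. 1–5 stars for how pleasant they are to play with
}

interface HubReputation {
  gamerId: string;
  ratings: PlayerRatings;
  recentPlayers: string[]; // gamer IDs you played with recently, ratable at any time
}

const mean = (xs: number[]): number | null =>
  xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : null;

// Other players can weigh the two scores differently when deciding who to play with.
function summarize(rep: HubReputation): { skill: number | null; amiability: number | null } {
  return {
    skill: mean(rep.ratings.skill),
    amiability: mean(rep.ratings.amiability),
  };
}
```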

DOUBTS: Our evaluators expressed some doubts about the feasibility of the system:

 

  • Game Performance: Would they really be able to file a report quickly without affecting their performance in the current game?

  • Effective Results: Would the results of the reporting be immediate and significant?

  • Privacy: Would potential system users consider registration with government IDs too great a violation of their privacy?

  • Cross-Nation: Would the system be able to work with government IDs from different states and nations?

NEXT STEPS: If the project were to continue, our next prototypes would be more detailed and interactive:

  • A more complete interactive mockup of the Central ID Hub website.

  • A minimal implementation of the reporting features into a simple but playable game.

  • We would demonstrate these prototypes as proof of concept to generate industry support.
