How Video Games Do Feedback Well (and Poorly)
I sometimes get asked why video games are so popular. Part of my answer is that we love getting good feedback about our performance because it helps track our mastery of skills and progress towards goals. Not only do we love getting this kind of feedback, but we typically don’t get enough of it in our jobs or schools.
At work, feedback comes from things like performance reviews, customer surveys, and sales reports. At school, we get grades weeks after we turn things in, sometimes longer. In either case, we often have to deal with performance feedback that’s delayed, vague, of dubious origin, or otherwise not ideal. And that’s if we get it at all. A lot of people only get feedback when they screw up.
So we turn to video games, which scratch this psychological itch far more effectively.
This is because video games are engineered to give us really good feedback about our performance, which lets us adjust our strategies, change our behaviors, and reach our in-game goals. Specifically, performance feedback in video games is typically:
- Immediate, not delayed
- Frequent, not intermittent
- Focused on outcomes, not people’s identities
- A mix of positive and negative
- Useful for showing progress towards goals (think progress bars)
These are all characteristics that psychologists studying performance feedback have found we love, and that lead us to actually use the feedback to improve. So, I guess feedback in video games is wholly superior to workplace feedback, eh? Guess that’s why all them kids are playing the Fortnite instead of pursuing their careers.
Actually, no. The feedback we get in games is typically missing a critical component.
I’ve been reading about what organizational psychologists know about effective performance feedback. While all of those things above are indeed characteristics of effective feedback, there’s one major feature of effective feedback that video games rarely, if ever, deliver:
Focus on process, not just results.
Let me explain. It’s fine to get feedback that lets you know outcomes: you hit your sales numbers, you killed the bejeezus out of that goblin, and so on. You need feedback about outcomes, in fact. But really effective performance feedback also provides information about the process, strategies, and specific behaviors you used to get there, so that you know exactly why you succeeded or failed. Games are bad at this. The closest they usually get is a data dump without much insight or analysis.
At the end of a game of Heroes of the Storm, for example, the game may tell you your team lost and give you some stats. But it won’t say “Hey, when playing as Sonya you should have spent more time taking merc camps and waveclearing, then jumped into teamfights after level 10.” Video games just don’t do this kind of feedback, because it requires very human-like thinking and expertise. Thus the popularity of video guides and the burgeoning market for video game coaches.
But I have found one third-party tool that comes close to automating the kind of process feedback you would get from a good coach or manager.
World of Warcraft (WoW), as you may know, is a massively multiplayer game where players team up to tackle challenges like big boss battles. Especially at high-level play, WoW offers many different approaches to these fights and many different roles for players to fill. Each player on such a raid must not only properly equip and prepare themselves, but also perform well when it’s time to sling spells or swing swords. You can perform well, or you can perform poorly. WoWAnalyzer helps you figure out which one you did, and how to improve.
The tool was originally created by Dutch software developer Martin Hols. He had been playing WoW and even programming add-ons for it for years, but in 2016 Hols became serious about improving his Holy Paladin spec. So he joined a guild and got to work. “As a part of my desire to improve I joined the (Holy) Paladin Discord server,” Hols told me via e-mail. “After observing for a bit, I joined the conversation and quickly became a regular there. I started doing some basic analysis using Warcraft Logs (a site that allows you to view logs of boss fights you did) and this slowly got more and more advanced.”
Hols soon noticed that manual review of combat logs was impractical but that a lot of the information was suitable for automation. One could feed these logs into a program and have it return specific feedback. Seeing it as an interesting challenge, he eventually created a proof of concept for a “Holy Paladin Mastery Effectiveness Calculator.” His work quickly gained attention and eventually expanded to include feedback for dozens of specs, equipment, and classes. To reflect this broader application, Hols swapped out the project’s somewhat clunky name along the way for the much more direct “WoWAnalyzer.”
What makes WoWAnalyzer impressive to me is how it provides the kind of process feedback that video games usually neglect. It looks at your combat logs and then gives you very specific feedback and recommendations on what to stop doing and what to do more of. A report for a Fire Mage build, for example, might look at her recent performance and note “You cast Fireball instead of Scorch when the target is under 30% health 11 times. When using Searing Touch always use Scorch instead of Fireball when the target is under 30% health since Scorch does 150% damage and is guaranteed to crit.” And when players do something well, they’re congratulated for it.
The logic and content of these reports are built by dozens of contributors who know different player specs inside and out. To help make feedback reports consistently useful, Hols and his team put together report writing guidelines that recommend what he describes as “concise suggestions that allow users to quickly understand what potential issues and changes they need to make to improve.” Suggestions should:
- Explain what was found
- Make a suggestion that is future-oriented
- Explain why the suggestion is important
- Suggest a specific, better behavior to take as an alternative to what the player did
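To make this concrete, here’s a minimal sketch of how a rule like the Fire Mage example above could follow these guidelines. This is not WoWAnalyzer’s actual code or API; the event shape, the `SCORCH_THRESHOLD` constant, and the `analyzeFireballUsage` function are all hypothetical, just an illustration of “detect a pattern in the log, then phrase it as something the player can change.”

```typescript
// Hypothetical combat-log event shape; real Warcraft Logs data is far richer.
interface CastEvent {
  ability: string;             // e.g. "Fireball" or "Scorch"
  targetHealthPercent: number; // target's health when the cast landed
}

// Mirrors the report-writing guidelines above.
interface Suggestion {
  whatWasFound: string;   // guideline 1: explain what was found
  recommendation: string; // guidelines 2 & 4: future-oriented, specific alternative
  why: string;            // guideline 3: why the suggestion matters
}

const SCORCH_THRESHOLD = 30; // Searing Touch's low-health range, in percent

// Count the problem casts and, if any were found, build a suggestion
// in the recommended what/why/do-this-instead format.
function analyzeFireballUsage(casts: CastEvent[]): Suggestion | null {
  const badCasts = casts.filter(
    (c) => c.ability === "Fireball" && c.targetHealthPercent < SCORCH_THRESHOLD
  ).length;

  if (badCasts === 0) return null; // nothing to flag; the player did well here

  return {
    whatWasFound: `You cast Fireball instead of Scorch ${badCasts} times while the target was under ${SCORCH_THRESHOLD}% health.`,
    recommendation: `When using Searing Touch, always cast Scorch instead of Fireball once the target drops below ${SCORCH_THRESHOLD}% health.`,
    why: "In that range Scorch does 150% damage and is guaranteed to crit.",
  };
}
```

The real analyzers are far more involved, but the shape of the feedback is the same: a specific finding, a specific alternative behavior, and the reason it matters.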
In this way, WoWAnalyzer uses what psychologists studying feedback call “feedforward.” Instead of dwelling on what you did in the past, this technique provides information about how to do better next time. It’s a subtle shift in focus, but people tend to react well to it.
But this kind of analysis and feedback on performance is really hard to do and doesn’t fit all kinds of games. In general, video games just aren’t equipped to provide it. But maybe that will change. Maybe advances in artificial intelligence will get to the point where a virtual coach can examine your performance in real time and give you feedback and feedforward after every match, play, or level.
Maybe it could tell you something like “You kept trying to use the shotgun at medium to long range, resulting in an accuracy of only 22%; try switching to a rifle for longer range engagements.” Or “Your team’s tank is susceptible to physical damage and died more often than normal; if you want to be an effective support choose the Iron Clad ability so that you can grant additional armor.”
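Nothing like that exists in most games today, but the mechanics aren’t exotic. Here’s a rough sketch of the shotgun example, again hypothetical: the `ShotEvent` fields, the 20-meter range cutoff, and the 35% accuracy threshold are all made-up assumptions about what a game client might record.

```typescript
// Hypothetical per-shot telemetry a game client could record.
interface ShotEvent {
  weapon: string;        // e.g. "shotgun", "rifle"
  distanceMeters: number;
  hit: boolean;
}

// Accuracy for one weapon at or beyond a given engagement distance.
function accuracyBeyond(shots: ShotEvent[], weapon: string, minDistance: number): number {
  const relevant = shots.filter((s) => s.weapon === weapon && s.distanceMeters >= minDistance);
  if (relevant.length === 0) return NaN;
  return relevant.filter((s) => s.hit).length / relevant.length;
}

// Turn the stat into feedforward: what to try next match, not just what went wrong.
function coachShotgunUsage(shots: ShotEvent[]): string | null {
  const accuracy = accuracyBeyond(shots, "shotgun", 20); // 20m+ standing in for "medium to long range"
  if (Number.isNaN(accuracy) || accuracy >= 0.35) return null; // arbitrary "good enough" cutoff

  return (
    `You kept using the shotgun at 20m and beyond, hitting only ${Math.round(accuracy * 100)}% of your shots; ` +
    `next match, try switching to a rifle for those longer engagements.`
  );
}
```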
Stats, data, and logs are great, but something that helps players make sense of them and apply lessons would be extremely appealing.