Monday, December 13, 2021

Hack the Hackers: The Potential Power of Narrative to Stop Cheating in Video Games

I find nothing fun or fair about using hacks in multiplayer games. I don't condone it. Still, I view hackers showing off on livestreams with the same fascination as women who binge-watch murder shows. It's a morbid curiosity I've nursed for a while, and I wonder if we're missing something in the effort to stop them.

I found this neat quote from Johan Huizinga, who studied the importance of play in human culture. In calling cheating "anti-play," he said, "as soon as the rules are transgressed the whole play world collapses."

That seems to be true. Cheating in multiplayer games runs rampant in everything from World of Warcraft to World of Warships to New World. It causes unending rage across player communities, requires companies to spend millions of dollars annually to find, fix, and destroy hacker accounts, gives development teams migraines, and leads to lawsuits whenever players argue the ban-hammer was erroneously dropped on them. 

But all of this treats cheating like it's an arms race. The cheaters get a better hack, so now we need a more advanced system to detect and thwart it. We are locked in a constant cycle of threat creation and threat response. I think there's a broader approach to this conflict: an opportunity to win hearts and minds.

Dan Ariely at Duke University has studied the mindset of cheating for years. If you have time, his TED talk on why we cheat is riveting. If you don't, here's his quick 13-point breakdown.

Either way, here's what I find most interesting when we think about people who hack to gain an advantage over others in multiplayer. From the article:

"Increasing the reward gained by cheating or the penalty of being caught cheating did little to change the amount of cheating in the group."

"The more abstracted we are about the cheating, the more willing we are to cheat."

"If we feel like someone has cheated us, then we are more likely to cheat to get even."

"We cheat when we see others we associate with cheating."

"A nudge to remember our moral compass has a profound impact to reduce cheating if it comes before the opportunity to cheat presents itself."

In light of Ariely's findings, it seems like nuking player accounts is a doomed strategy. The Call of Duty community's response to the banning of 10,000 accounts at a time appears to back that up. It's all drop-in-the-bucket and we-haven't-seen-the-last-of-them stuff. By then the play world is already compromised: if people lose to a cheater or watch one showing off hacks on stream, the damage is done and the cancer is already spreading. This is a problem that demands a more proactive approach. But what?

I focus on points 2 and 5 above. Can we reduce the "abstractness" of the game environment to the player? Can we shine a laser pointer at the player's own ethical code before they engage in the game?

I think we can.

The principle of "abstractness" relative to the game is a subject of study unto itself. You play for "credits" or "tokens" instead of actual money. Even if you paid real money for those credits, Ariely finds there's still a diminished sense of value and a greater willingness to cheat to accumulate them. You could argue that your K/D ratio in a multiplayer game becomes a kind of currency too; if you couldn't track it, you couldn't be driven by it.

The biggest sense of abstractness lies in the game world itself. "That is not actually me cheating, and I am not actually harming other people, and the owners of these avatars who I shoot are nowhere near me and will never know my true identity or where I live." 

A player therefore has no sense of connection to their fellow players or the game world. It's the same problem as trying to get older generations to believe in climate change. "I'm not going to be here in fifty years, what do I care where the sea level is then?"

I believe narrative structures can be developed in games to make players care about the overall game environment. I mean, the whole purpose of narrative is to make the player feel immersed in the world, right? 

America's Army did something that felt a little experimental with the concept back in the day. They set up an honor system that unlocked "special forces character capabilities" if you played by the rules long enough. Notably, the fastest and roughest way to lose honor points was to kill a teammate. Even if that RPG hit them by accident, you lost 10 points (it took countless hours of play just to earn 5) and got locked out of the game for a few hours. It definitely made you think twice about pulling the trigger. 
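To make the mechanic concrete, here's a minimal sketch in Python of that kind of honor system. It is not America's Army's actual implementation; I'm reconstructing the shape from memory, and the constants (how fast honor accrues, the 100-point unlock threshold, the three-hour lockout) are assumptions, with only the 10-point teamkill penalty coming from the description above.

```python
import time
from typing import Optional

# Sketch of an America's Army-style honor system (illustrative, not the real code).
# Honor accrues slowly, a teamkill costs 10 points, and the offender is locked
# out of the game for a few hours.

HONOR_PER_CLEAN_ROUND = 1        # assumed: slow accrual for playing by the rules
TEAMKILL_PENALTY = 10            # the figure mentioned above
LOCKOUT_SECONDS = 3 * 60 * 60    # "a few hours"; assumed to be three
SPECIAL_FORCES_THRESHOLD = 100   # hypothetical unlock threshold


class PlayerHonor:
    def __init__(self) -> None:
        self.points = 0
        self.locked_until = 0.0

    def is_locked_out(self, now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        return now < self.locked_until

    def reward_clean_round(self) -> None:
        self.points += HONOR_PER_CLEAN_ROUND

    def punish_teamkill(self, now: Optional[float] = None) -> None:
        # Intent doesn't matter: an accidental RPG counts the same as a deliberate one.
        now = time.time() if now is None else now
        self.points = max(0, self.points - TEAMKILL_PENALTY)
        self.locked_until = now + LOCKOUT_SECONDS

    def special_forces_unlocked(self) -> bool:
        return self.points >= SPECIAL_FORCES_THRESHOLD
```

The asymmetry is the whole point: honest play earns value slowly enough that a single violation genuinely hurts, which is what made players think twice.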

I think the biggest thing that can be done is to repeat a practice Ariely tried in an experiment. He gave students in two separate classrooms an exercise where the opportunity to cheat was obvious. The difference was that one cohort signed an honor code before the exercise and the other didn't. The cohort that signed the honor code consistently exhibited lower levels of cheating.

Could a main character from a game's campaign ask a player to take 30 seconds to sign an honor code before jumping into multiplayer? Would that be such an inconvenience to the community that it kills the log-on rate? I can't imagine it being worse than what we are already experiencing. 

It sounds silly and corny and antithetical to the player experience, I know. Having the Master Chief tell you to raise your right hand and repeat the UNSC Academy Honor Code might be a stupid idea. Then again, if it's stupid but it works, it's not stupid.
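For what it's worth, the gate itself would be trivial to build; the hard part is the writing and the delivery. Here's a minimal sketch, with the pledge wording, the 30-day re-acknowledgement window, and every name in it invented purely for illustration:

```python
from datetime import datetime, timedelta
from typing import Optional

# Sketch of a pre-match honor-code prompt. Everything here is hypothetical;
# the point is only where the prompt sits: before matchmaking, not buried
# in a EULA the player clicked through months ago.

PLEDGE = (
    "I will play to win, not to ruin.\n"
    "I will not use tools that give me an unfair advantage.\n"
    "I will remember there are people behind the avatars."
)

REACKNOWLEDGE_AFTER = timedelta(days=30)  # assumed cadence for re-signing


def needs_pledge(last_signed: Optional[datetime]) -> bool:
    """Prompt players who have never signed, or haven't signed recently."""
    if last_signed is None:
        return True
    return datetime.utcnow() - last_signed > REACKNOWLEDGE_AFTER


def present_pledge(prompt=input) -> bool:
    """Show the pledge and ask the player to sign it by typing their name."""
    print(PLEDGE)
    answer = prompt("Type your player name to sign: ").strip()
    return len(answer) > 0


def enter_matchmaking(last_signed: Optional[datetime]) -> bool:
    """Gate matchmaking on the pledge, so the nudge comes before the opportunity."""
    if needs_pledge(last_signed):
        if not present_pledge():
            return False  # declined, so stay in the lobby
        # persist the new acknowledgement timestamp here
    return True  # proceed to matchmaking
```

Per Ariely's fifth point, the timing is what matters: the nudge has to land before the opportunity to cheat, which is why this sits in front of the matchmaking queue rather than at account creation.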

Ultimately, I find Ariely's argument compelling that keeping multiplayer spaces safe is more about giving a little TLC to our buggy moral code than it is about defeating malicious computer code. That means flipping the script on the player's internal monologue. The best way to hack that is with a new narrative that compels them to take care of their game environment. 


