In October, 29,000 neuroscientists gathered in Chicago to discuss new research in their sprawling field at the Society for Neuroscience’s annual meeting. Amid mountains of abstracts on every conceivable aspect of brain science, there were a surprising number of studies about an unlikely subject: video games.
While plenty of pop-cultural bandwidth has been devoted to showing how video games harm us, from ostensibly making us less social to making us more violent, there’s been remarkably little scientific study of whether they do anything good for us. “The probable negative effects of video game playing are well discussed in the media,” wrote one presenter, Sabrina Schenk, in her abstract. “But the positive effects are almost completely neglected.”
That’s changing. Not only are video games increasingly diverse and played by more people than ever, they’re also a fantastic controlled simulation of real-world tasks. That makes them perfect for scientists who want to study the complex neurological mechanisms at work while we play, say, Rise of Nations.
Some of these studies look at how typical gamers compare to non-gamers on cognition tasks, while others look at whether non-gamers benefit neurologically when they start playing video games. For example, Schenk, a PhD student at the Institute of Cognitive Neuroscience at Ruhr University Bochum, studies how people who play video games might actually be much better at some tasks than those who don’t.
In an experiment she presented at the conference, Schenk asked fifteen “gamers” (people who played more than 20 hours per week) and “non-gamers” to complete a common puzzle designed to test a person’s so-called “probabilistic” learning abilities. As the participants worked, Schenk imaged their brains with an MRI machine. Not only did the gamer group do a lot better at the task, but they also used a more complex “multi-cue” strategy to complete it.
Playing World of Warcraft at the BlizzCon in 2015. AP Photo/Jae C. Hong.
People who didn’t play video games, meanwhile, usually ended up relying on a single cue. Schenk also told me that the brains of the gamer group showed some unique activations during the task. Gamers exhibited more activity in the frontal cortex and hippocampus, which are associated with learning and memory formation, as well as in the posterior cingulate cortex and the precuneus, which are often associated with episodic memory and spatial learning.
What’s so intriguing about these types of findings isn’t that gamer brains light up in a unique way while they’re solving a puzzle. It’s that through training, video games might be able to teach anyone to think like a gamer and light up certain regions of their brain.
In another paper presented at the conference (and since published in the Journal of Neuroscience) Gregory Dane Clemenson, a Postdoctoral Fellow at University of California at Irvine, explored the idea of “environmental enrichment.”
Here’s a basic example: if you give a dog a more stimulating environment, like buying it new toys or making its kennel bigger, you also improve its hippocampal functioning and neuroplasticity. It’s a proven phenomenon for many animals, and the same idea may be true for humans: if we expose our brains to a broader range of spaces and richer experiences, we can improve our cognition and even slow its eventual decline.
Hippocampal neurons. Dr A.Irving / University of Dundee / Wellcome Images.
Clemenson and his co-author, Craig Stark, wanted to find out whether complex 3D video games could enrich our environments as much as actually exploring a new city or place. Imagine if an elderly bedridden person, unable to even go outside, could explore a 3D video game to reap the same cognitive benefits they would get from going on a walk or visiting a new place.
“Because of their engaging experiences and enriching 3D virtual environments, the same video games that have been played for decades by children and adults alike may actually provide our brain with meaningful stimulation,” Clemenson and Stark write.
At the conference in October, Clemenson explained how they’re testing this idea. Their study includes two basic experiments: one on self-described gamers, and another on people who don’t play. The first experiment divided gamers by the level of complexity in the games they chose to play: Tetris, Sonic the Hedgehog, and Zelda were the 2D examples, while 3D games included Halo, Grand Theft Auto, and League of Legends (LOL).
yxxxx2003 on Flickr/Creative Commons.
All of these games have different versions of dimensionality, but the most complex versions, like LOL, let you actually move the camera away from the player to explore other parts of the virtual environment.
After classifying subjects by the complexity of their most-played games, the researchers tested both memory skills and hippocampal functioning using a pattern-separation task scored with the lure discrimination index, or LDI. They found that players who favored the more complex 3D games, like League of Legends, scored better on the hippocampus-dependent LDI task than those who preferred 2D games like Tetris.
Clemenson even replicated the effect among competitive gamers who play at the top level of both 2D and 3D games: sure enough, the more complex the virtual environment, the better their scores on the LDI.
In Clemenson and Stark’s second experiment, they evaluated whether those same mental benefits can be imparted to people who don’t typically play games at all.
They recruited 69 non-gamers, and tested their memory skills and hippocampal functioning to get a baseline. Subjects then spent 30 minutes every day, for ten days straight, playing either Angry Birds (a 2D game) or Super Mario 3D World (a 3D game), while a third group played nothing. Clemenson and Stark kept testing the subjects’ memory during and after the 10-day period.
The group that had played Super Mario 3D World showed improvement on the memory tests, while the Angry Birds players and the passive control group didn’t.
As Clemenson points out, the results pose as many new questions as they answer. For example, exploring that virtual 3D world on-screen might activate the same parts of the brain as exploring the real world–but will training in Super Mario 3D World make you better at creating and recalling real-world places? Clemenson calls this “translation,” and says demonstrating it will be crucial going forward.
“Ultimately, what we would really like to demonstrate is that learning to explore these virtual environments could help people learn, remember, and even explore real-world environments,” he wrote over email. “This would be a true translational effect.”
Right now, Clemenson and his colleagues are testing how gaming could help aging populations slow cognitive decline. Using games like Minecraft and Super Mario 3D World, they’re hoping to find out whether video games that give players a virtual version of “environment enrichment” can do just as much as the real thing.
Other scientists are studying similar ideas. Last year, Daphne Bavelier, a neuroscientist who in 2003 introduced the idea of video game-based learning in Nature and has led the field since, published a paper in the Annual Review of Neuroscience called “Brain plasticity through the life span: learning to learn and action video games.”
In the paper, Bavelier argues that action video games, like Call of Duty or Medal of Honor, don’t just sharpen gamers’ specific perceptual and cognitive skills, like vision, motion-tracking, and decision-making. Instead, she says, they actually teach gamers how to learn. It’s a controversial idea that has spurred a discussion that will play out over the coming years.
Over the last three decades, the way humans live has radically changed. Rather than spending most of our time seeing and interacting with the physical world, we now spend huge amounts of time interacting with screens, which often represent a virtual simulation of the real world. Thirty years isn’t very long, in science time, so questions about how these new behaviors affect us are only beginning to be thoroughly studied. But the field is quickly growing into one that could unlock more of gaming’s benefits—and pitfalls too.
“Video game playing is not only good or bad. It can be both and should be used moderately and wisely,” as Schenk and her co-authors put it. “Too much could be as bad as too little.”
Lead image: A scene from Paris Games Week in November, 2015. AP Photo/Francois Mori.
Contact the author at kelsey@Gizmodo.com.