Watching Synthetic Messenger is a somewhat dissociative experience. It operates in a Zoom call with 100 participants, all of whom are bots. Observers can watch these bots—which are strangely anthropomorphized with images of disembodied hands and voices that say “scroll” and “click” repeatedly—methodically scroll through news articles about climate change and click every ad on each page.
The project, created by two New York artists-cum-engineers, launched earlier this month. In its first week and a half online, its bots visited 2 million climate articles—you can see them listed here—and clicked on 6 million ads.
If this all seems like a bizarre, trippy art project, it definitely is. But it’s also a piece of criticism about how narratives about the climate crisis are shaped by the media.
Most online outlets are funded by advertisers. Stories that garner more ad clicks can also become more visible in Google’s search algorithms, drawing more eyes to the page. When certain stories garner more views and engagement, news organizations are more likely to publish similar articles. Absurdly, this means advertising mechanisms and algorithms can play an outsized role in determining what news people see rather than other factors like, um, how important the story is.
“With this project, we wanted to see how that media ecology affects our actual ecology, how narrative affects our material realm,” Sam Lavigne, an artist and assistant professor in the Department of Design at the University of Texas, said.
Of course, conflicting narratives have always played a role in the climate crisis, as Lavigne was quick to note. Polluters know that controlling how people talk and think about the climate crisis is important, so they’ve spent fortunes on all sorts of misinformation campaigns, including on shaping narratives in media.
“The narrative around climate change has been so controlled by the fossil fuel industry and lobby groups,” Lavigne said.
Algorithms have further distorted how news—or, increasingly, misinformation—reaches people. YouTube’s algorithm for recommending videos, for instance, has encouraged viewers to watch videos full of climate denial. YouTube also sold ads against those videos, profiting off misinformation while incentivizing viewers to consume ever-more of it.
As historically damaging wildfires spread across Australia a year-and-a-half ago, a narrative sprang up that they were sparked by arsonists, not by the climate crisis. That misinformation, a group of researchers found, was spread with the use of online trolling bots. Conservative media then turned around and amplified those claims, creating a feedback loop where everyone was debunking lies rather than talking about how to address the climate crisis. (The same scenario played out in the U.S. last year.) Yet as Tega Brain, who co-created the project, noted, these aren’t the only ways that algorithms have colored the media landscape.
“All news, and therefore all public opinion is being shaped [by] algorithms,” Brain, an assistant professor of digital media at New York University whose background is in environmental engineering, said. “And the algorithmic systems that shape news are these black-box algorithms,” she added, referring to tech companies’ practice of hiding their code and priorities from the public.
Synthetic Messenger, then, looks to game the system by showing bot-fed interest in climate stories. While it could play a small role in amplifying climate coverage, there are some complications. For one, since its algorithm is imprecise and based on climate-related keywords, it also clicks ads on climate-denying media. Its creators have tried to get around that by blacklisting denialist websites like those owned by Rupert Murdoch, but it’s not a perfect system.
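To make the limitation concrete, here is a minimal Python sketch of the keyword-and-denylist approach described above. The keyword set, domain names, and `should_visit` function are illustrative assumptions, not the project's actual code; the real bots' selection logic has not been published in this form.

```python
from urllib.parse import urlparse

# Hypothetical keyword list and denylist for illustration only.
CLIMATE_KEYWORDS = {"climate change", "global warming", "carbon emissions"}
DENYLIST_DOMAINS = {"example-denial-news.com"}  # stand-in for known denialist outlets

def should_visit(url: str, headline: str) -> bool:
    """Return True if an article looks climate-related and is not
    published by a denylisted outlet."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    if domain in DENYLIST_DOMAINS:
        return False
    text = headline.lower()
    return any(keyword in text for keyword in CLIMATE_KEYWORDS)

# Keyword matching is imprecise: a denial piece that mentions
# "climate change" still passes unless its domain is denylisted.
print(should_visit("https://news.example.com/story",
                   "Climate change drives record heat"))   # True
print(should_visit("https://example-denial-news.com/a",
                   "Climate change is a hoax"))            # False
```

The second call shows why the creators needed the denylist at all: keyword matching alone cannot distinguish coverage of the climate crisis from denial of it.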
If this project were primarily designed as a tool for political organizing, those might be big sticking points. But Brain and Lavigne are clear that they know their project won’t change the media landscape or fight the climate crisis itself.
“We don’t intend for it to be read as like, ‘here is this really effective new activist strategy to deal with climate change,’” said Brain. “Essentially, with this project we’re doing what’s called ‘click fraud,’ and if we did it for a long enough time and at a large enough scale, it wouldn’t work, because obviously ad networks are doing everything they can to sort of protect against automated behavior. They’d stop it.”
Rather, the purpose is to call attention to the screwed-up incentive structures that determine what climate stories get told and amplified by advertisers and search algorithms.
“It’s not like we are offering this as a solution to this problem that we have. The solution is meaningful climate policy, effective policy,” said Brain. “But we’re trying to open up a conversation and reveal the way that our media landscape is currently operating.”