“The narrative around climate change has been so controlled by the fossil fuel industry and lobby groups,” Lavigne said.

Algorithms have further distorted how news—or, increasingly, misinformation—reaches people. YouTube’s algorithm for recommending videos, for instance, has encouraged viewers to watch videos full of climate denial. YouTube also sold ads against those videos, profiting off misinformation while incentivizing viewers to consume ever-more of it.

As historically damaging wildfires spread across Australia a year-and-a-half ago, a narrative sprang up that they were sparked by arsonists, not by the climate crisis. That misinformation, a group of researchers found, was spread by online trolling bots. Conservative media then turned around and amplified those claims, creating a feedback loop where everyone was debunking lies rather than talking about how to address the climate crisis. (The same scenario played out in the U.S. last year.) Yet as Tega Brain, who co-created the project, noted, these aren’t the only ways that algorithms have colored the media landscape.


“All news, and therefore all public opinion is being shaped [by] algorithms,” Brain, an assistant professor of digital media at New York University whose background is in environmental engineering, said. “And the algorithmic systems that shape news are these blackbox algorithms,” she added, referring to tech companies’ practice of hiding their code and priorities from the public.

An alternate vision of the carbon cycle.
Gif: Tega Brain and Sam Lavigne

Synthetic Messenger, then, looks to game the system by showing bot-fed interest in climate stories. While it could play a small role in amplifying climate coverage, there are some complications. For one, since its algorithm is imprecise and based on climate-related keywords, it also clicks ads on climate-denying media. Its creators have tried to get around that by blacklisting denialist websites like those owned by Rupert Murdoch, but it’s not a perfect system.
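The keyword-and-blocklist approach described above can be sketched roughly as follows. This is a hypothetical illustration of the general technique, not the project’s actual code; the keyword list, blocklist, and function names are all invented for the example.

```python
from urllib.parse import urlparse

# Illustrative keyword list; the project's real list is not public here.
CLIMATE_KEYWORDS = {"climate", "warming", "emissions", "carbon", "wildfire"}

# Illustrative blocklist of denialist domains (hypothetical placeholder).
BLOCKED_DOMAINS = {"example-denial.com"}


def is_target_article(url: str, headline: str) -> bool:
    """Return True if the headline matches a climate keyword and the
    article's domain is not on the blocklist."""
    domain = urlparse(url).netloc.lower()
    # Skip blocklisted sites, including their subdomains.
    if any(domain == d or domain.endswith("." + d) for d in BLOCKED_DOMAINS):
        return False
    # Naive keyword match on the headline's words.
    words = set(headline.lower().split())
    return bool(words & CLIMATE_KEYWORDS)
```

The sketch also shows why the approach is imprecise: a bare keyword match can’t tell climate reporting from climate denial, so denialist coverage slips through unless its domain is explicitly blocklisted.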

If this project were primarily designed as a tool for political organizing, those might be big sticking points. But Brain and Lavigne are clear that they know their project won’t change the media landscape or fight the climate crisis itself.


“We don’t intend for it to be read as like, ‘here is this really effective new activist strategy to deal with climate change,’” said Brain. “Essentially, with this project we’re doing what’s called ‘click fraud,’ and if we did it for a long enough time and at a large enough scale, it wouldn’t work, because obviously ad networks are doing everything they can to sort of protect against automated behavior. They’d stop it.”

Rather, the purpose is to call attention to the screwed-up incentive structures that determine what climate stories get told and amplified by advertisers and search algorithms.


“It’s not like we are offering this as a solution to this problem that we have. The solution is meaningful climate policy, effective policy,” said Brain. “But we’re trying to open up a conversation and reveal the way that our media landscape is currently operating.”