Substack, best known as a subscription newsletter platform, just launched a copycat Twitter competitor called Substack Notes. Elon Musk wasn’t pleased and apparently took a number of steps to block Substack links on Twitter. Substack CEO Chris Best went on the Verge’s Decoder podcast to discuss his company’s rocky entry into the world of social media. When the conversation turned to content moderation, Best made a surprising decision: He refused to take a stance on overt racism.
Best was interviewed by Nilay Patel, editor-in-chief of the Verge and host of Decoder. “You have to figure out, ‘Should we allow overt racism on Substack Notes?’ You have to figure that out,” Patel said.
“No, I’m not going to engage in speculation or specific ‘would you allow this or that’ content,” Best said.
Substack is a platform with over 500,000 paying subscribers. Normally, CEOs are well prepared for this kind of question, and for questions far more complicated. In Substack’s case, controversy over its content is probably the biggest thing it’s known for. But as Best went back and forth with Patel, he refused to take a stand on how his company would handle a post such as “all brown people are animals and they shouldn’t be allowed in America.” An incredulous Patel gave Best several opportunities to recover from his train-wreck answers—opportunities that Best turned down.
“You know this is a very bad response to this question, right? You’re aware that you’ve blundered into this. You should just say no. And I’m wondering what’s keeping you from just saying no,” Patel said.
“I have a blanket [policy that] I don’t think it’s useful to get into ‘would you allow this or that thing on Substack,’” Best said.
Substack did not immediately respond to a request for comment.
Racism is bad, but the American right wing has spent years turning that simple statement of fact into a cultural flashpoint. Anyone who dares state the obvious truth about hate and discrimination risks a coordinated attack from a relatively small but politically significant mob of anti-woke justice warriors. That may explain Best’s refusal to engage with the idea that his company should, perhaps, take steps to avoid promoting and profiting from racism.
That strategy is probably not the best approach. One of the many reasons it’s too bad Best chose this racist hill to die on is that it overshadowed a number of other interesting questions facing Substack.
Shortly after news broke about Substack’s Twitter competitor, Elon Musk’s company took action. For a moment, it was impossible to search for the word “Substack” on Twitter. Users couldn’t like or retweet posts that contained Substack links, and Twitter even marked them unsafe, warning users who clicked that “the link you are trying to access has been identified by Twitter or our partners as being potentially spammy or unsafe.”
Twitter’s actions against Substack came to light amid Elon Musk’s falling-out with journalist-turned-propagandist Matt Taibbi, one of the writers responsible for the Twitter Files—a self-important episode where Musk selectively leaked internal documents about his own company. Taibbi complained that Twitter was blocking links to Substack, where he makes his living. Musk responded by claiming that the links were never blocked (they were) and that Substack was trying to steal information from the “Twitter database,” whatever that means. Taibbi then left Twitter for Donald Trump’s Truth Social.
In the Decoder interview, Best denied that Substack was trying to download a massive portion of Twitter. “It’s one of several claims that got bandied around during this time. It’s not true,” Best said. He even went on to claim that Substack Notes isn’t intended to be a Twitter competitor, which is odd, considering that it’s obviously a Twitter competitor. Look at this GIF Substack created for Notes and tell me if it reminds you of any social media platforms:
The whole “I don’t want to take a stand on banning racism” debacle isn’t the first time Substack sparked controversy over content moderation. In 2022, the Center for Countering Digital Hate estimated that Substack makes $2.5 million a year from content that promotes dangerous misinformation about vaccines. Substack responded with a blog post arguing that the platform should do as little content moderation as possible.
“We make decisions based on principles not PR, we will defend free expression, and we will stick to our hands-off approach to content moderation,” Best and his co-founders wrote in the blog post. “While we have content guidelines that allow us to protect the platform at the extremes, we will always view censorship as a last resort, because we believe open discourse is better for writers and better for society.”
As the Verge’s Patel pointed out, Substack’s newsletter product is more like a service provider such as Gmail or even a phone company, the kind of service where the general American consensus favors free expression over company interference. Most people don’t want Google deciding what they can and can’t say in an email, with few exceptions. But with Substack Notes, the company is dealing with a social media product where anyone can make a post that other people will see. That’s a lot different from a newsletter people have specifically asked to read.
On a service that’s totally neutral, like a phone line or a physical bulletin board in a public space, you can argue that all ideas should be allowed to rise and fall on their own merits. An app like Twitter or Instagram doesn’t work like that. These aren’t neutral platforms; algorithms dictate the content users are exposed to. Despite what tech CEOs will tell you, that means social media companies are making editorial decisions. A lot of people think that means social media companies should take more responsibility to limit dangerous ideas.
Substack’s CEO may or may not be one of those people. It’s hard to know because he doesn’t want to tell you.