It was recently revealed that Facebook lost control of 50 million users’ private data, which ended up in the hands of a sleazy political data analytics company. Then Mark Zuckerberg took five days to acknowledge the scandal. Now, Zuck is on a media tour, and he’s refining his ruminative, valueless answers by the day.
It’s been a little over two weeks since Facebook announced it had suspended Cambridge Analytica for its illicit acquisition of the social network’s user data. Facebook had been aware of the infraction since 2015, but it trusted the data analytics firm to delete the data after issuing a warning. It turns out you shouldn’t just trust profit-driven companies to do the right thing.
The ensuing public fallout has been remarkably consequential for the social media giant. Zuckerberg is wanted for questioning by numerous legislators around the world, brands are deleting their Facebook pages, close to $100 billion was wiped from its market cap, plans to sell hardware have been scuttled, and the Facebook brand is considerably damaged.
Since he broke his silence with a non-apology, Zuckerberg has given numerous interviews that raise more questions than they answer. He’s iterating on his responses and getting more comfortable with vaguely explaining his positions that essentially amount to maintaining the status quo with a few tweaks. His latest interview with Vox’s Ezra Klein brought us more of the same with a few surprises.
In a recent interview with Recode, Zuckerberg said, “What I would really like to do is find a way to get our policies set in a way that reflects the values of the community, so I am not the one making those decisions.” He acknowledged his fundamental discomfort with “sitting here in California in an office making content policy decisions for people around the world.”
In Monday’s interview with Vox, he went a step further in his thought process when it comes to moderating content:
Right now, if you post something on Facebook and someone reports it and our community operations and review team looks at it and decides that it needs to get taken down, there’s not really a way to appeal that. I think in any kind of good-functioning democratic system, there needs to be a way to appeal. And I think we can build that internally as a first step.
So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.
Zuckerberg leans into the analogy that Facebook is more like a government now than a traditional company. But he rotates between whether this pseudo-government is more of a democracy or a dictatorship. In the same interview, he points out that his unique control over the publicly traded company means his decisions “are not at the whims of short-term shareholders.” That’s true: Facebook’s moves are instead at the whims of one 33-year-old who wants absolute power except when it comes to making decisions that aren’t fun. He proposes an independent appeals board, though it’s unclear whether he means a series of boards around the world or one centrally located but independent Facebook People’s Court. Who will decide the membership of this appeals board? Well, one man is in control of the company. Maybe he’ll hold an election.
“Facebook’s ultimate product is you,” has become a well-worn explanation for the ways in which it monetizes your private data to sell targeted advertising. Last week, Apple CEO Tim Cook came out swinging against Facebook saying, “The ability of anyone to know what you’ve been browsing about for years, who your contacts are, who their contacts are, things you like and dislike and every intimate detail of your life—from my own point of view it shouldn’t exist.” Cook argued that the advertising and data-monetization model is fundamentally flawed and sets a company up to betray its customers.
Zuckerberg has always maintained that advertising revenue keeps his service free and is, therefore, the most egalitarian approach. This philosophy saw its most public pushback when India rejected Facebook’s “free basic internet” offering as “digital colonialism.” Speaking with Vox, Zuckerberg got a little testy when Cook’s comments were mentioned:
You know, I find that argument, that if you’re not paying that somehow we can’t care about you, to be extremely glib and not at all aligned with the truth. The reality here is that if you want to build a service that helps connect everyone in the world, then there are a lot of people who can’t afford to pay. And therefore, as with a lot of media, having an advertising-supported model is the only rational model that can support building this service to reach people.
I don’t think at all that that means that we don’t care about people. To the contrary, I think it’s important that we don’t all get Stockholm syndrome and let the companies that work hard to charge you more convince you that they actually care more about you. Because that sounds ridiculous to me.
Going after Apple for charging high prices is a solid debate club tactic, but it avoids the issue and misrepresents Cook’s point. The Apple CEO wasn’t necessarily saying that his company’s model of selling a product directly to a customer is more virtuous, rather that it’s simply more benign. He pointed out that the approach of surveilling users and selling their data in the shadows made it inevitable that “something would occur and people would be incredibly offended by what had been done without them being aware of it.” Zuckerberg is just ignoring reality.
Zuckerberg explained on Monday that it’s a misconception that Facebook only rewards content that gets lots of clicks, likes, and shares in its news feed. “Meaningful content” is his preferred buzzword for what his algorithms are trained to surface. He said that Facebook uses focus groups to learn what users value most in their stream. The company has determined that what drives people’s “well-being” splits into two categories:
One is where people are connecting and building relationships, even if it’s subtle, even if it’s just I post a photo and someone I haven’t talked to in a while comments. That person is reminding me that they care about me.
The other part of the use is basically content consumption. So that’s watching videos, reading news, passively consuming content in a way where you’re not actually interacting with anyone or building a relationship. And what we find is that the things that are about interacting with people and building relationships end up being correlated with all of the measures of long-term well-being that you’d expect, whereas the things that are primarily just about content consumption, even if they’re informative or entertaining and people say they like them, are not as correlated with long-term measures of well-being.
So this is another shift we’ve made in News Feed and our systems this year. We’re prioritizing showing more content from your friends and family first, so that way you’ll be more likely to have interactions that are meaningful to you and that more of the time you’re spending is building those relationships.
Facebook made this exact same claim in April 2015 and June 2016. Is anything getting better? One way Zuckerberg could remove himself from this decision-making (an algorithm’s design is a decision) would be to simply go back to the linear timeline. But that notion doesn’t fit Zuckerberg’s approach to advertising, and it undercuts Facebook’s desire to be an addictive click-machine that always stimulates you into spending more time on the platform.
Oh, wait, Klein never really pressed Zuckerberg on the privacy concerns that kicked off this whole cycle of controversy in the first place. The discussion did talk around Facebook’s two biggest issues: scale and transparency. But Zuckerberg never really got into how those factors affect your privacy. Two billion users around the world give up their data to Facebook, and it has consistently abused the trust of those users. In the case of Cambridge Analytica, the social network let a big chunk of that data be weaponized in the 2016 election of Donald Trump and the UK Brexit referendum.
Zuckerberg told Klein that he realizes Facebook hasn’t been transparent enough about the prevalence of that type of issue. “We haven’t done a good job of publishing and being transparent about the prevalence of those kinds of issues, and the work that we’re doing and the trends of how we’re driving those things down over time,” he said. But he barely mentioned this example of violating user privacy, and no others. We don’t know how many other third-party developers sucked up Facebook data through apps and put it on the open market. And a thorough accounting would likely leave the world slack-jawed.
The fact is, Facebook probably doesn’t have anything close to an accurate figure on how its platform has been abused. It’s employing thousands of people to monitor its content because, as financially successful as it is, its algorithms can’t handle its scale alone. Zuckerberg doesn’t give us any concrete numbers to back his claims that the work he’s doing has been driving down objectionable trends. He never gives a walk-through of how the Facebook algorithm makes a decision because that information is worth billions. All the while, the platform has scaled in use so quickly that even if you don’t want to participate anymore, you feel an obligation to keep an account in case it’s needed. Ol’ Zuck doesn’t really have an answer to these problems. But as he said when addressing Facebook being used to organize ethnic cleansing in Myanmar: “it’s a constant challenge.”