This is the second installment of the Living With Data series, exploring encounters with data in our everyday lives.
Do my clicks count?
Molly, a 23-year-old high school science teacher in Brooklyn, writes:
In recent years I have clicked with care due to awareness of how personal data is used. But I have also begun to try to leverage that awareness. For example, if I want an article about an issue or candidate I care about (like Bernie Sanders) to be promoted on a site, I “do my part” by clicking on it multiple times. Likewise, if I don’t want an article or issue promoted, I avoid clicking it—even if I am interested in reading it. For example, I avoid most stories about terrorists because I believe their promotion encourages other would-be terrorists, and I would hate for data from my clicks to contribute.
My thinking is that news sites and Facebook really do promote content based on what algorithms tell them people want to see. In my own tiny way, I try to trick these algorithms into suggesting that “people” want to see what I want them to see by portraying myself as an example of a person interested in clicking (or not clicking) particular content. I know the effect must be small but hey, I try…
Am I super weird? Does anybody else do this? Do you do this? Do people try to trick algorithms into promoting preferred content in online contexts?
When I spoke with Molly later, she described this tactic as “upclicking,” that is, clicking on articles multiple times from The Guardian’s homepage, or clicking through an article link on Facebook. “I would upclick articles about inequity in education or violence against black communities by police—things that my friends in particular would not look at on their own,” Molly told me.
She “upclicks” articles that she fears might otherwise be dismissed as “radical” if she were to share them. She thinks her friends would be more open to clicking on something that shows up in a sidebar as a related article than if it’s just “another one of Molly’s political posts.” She’s hoping that clicking will be a subtle form of data activism.
We are increasingly looking to Facebook, Twitter, and other social platforms for news—news filtered through social and recommendation algorithms, like the recommended or trending articles that show up in your feed—so data shapes which news stories land in front of our eyeballs now more than ever.
You’re probably familiar with the concept of the filter bubble—the idea that algorithms keep readers coming back by showing them more of what they already like, based on their past behavioral data, thus serving up only content that matches their interests and beliefs.
Molly’s clicks are poking at the filter bubble, but are they strong enough to make it pop?
“I think that algorithms are really sensitive, in my imagined conception of them,” she says. “If it’s five more clicks, that would probably make a difference in promoting something or not.”
It’s a great theory, but does it actually work? The short answer is: It can’t hurt, but algorithms take into account many other metrics beyond clicks. How much each click counts, and how the various metrics are weighted against one another, is still a mystery to most of us. That’s the secret sauce of media platforms and publishers, protected as competitive advantage and constantly refined to stop us from gaming the system. But there are some things we do know.
When Molly clicks five times on the same article, that’s registering as “pageviews,” which amounts to the same device reading the same article five times. Pageviews generally aren’t weighed as heavily by recommending algorithms as “unique views,” which signal that each click is coming from a different person.
To take her click tactics to the next level, Molly could click on the article on multiple devices and social platforms, generating more uniques. Unique views are usually tied to an identifier like an IP address or a cookie in your browser, so she could go one step further and use services like Tor to change her IP address to make each click look like a unique visit.
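The distinction between pageviews and uniques comes down to deduplication on an identifier. A minimal sketch of how an analytics pipeline might tally both, assuming a hypothetical `count_views` helper and toy visitor IDs standing in for cookies or IP addresses:

```python
from collections import defaultdict

def count_views(click_log):
    """Tally raw pageviews and unique views per article.

    click_log: iterable of (article_id, visitor_id) pairs, where
    visitor_id stands in for whatever identifier the tracker uses
    (a cookie value, an IP address, etc.). Names are illustrative.
    """
    pageviews = defaultdict(int)
    uniques = defaultdict(set)
    for article_id, visitor_id in click_log:
        pageviews[article_id] += 1          # every click counts here
        uniques[article_id].add(visitor_id)  # repeat visitors collapse to one
    return {a: (pageviews[a], len(uniques[a])) for a in pageviews}

# Molly clicking the same article five times from one browser,
# plus one click from a friend:
log = [("bernie-profile", "cookie-molly")] * 5 + [("bernie-profile", "cookie-friend")]
print(count_views(log))  # {'bernie-profile': (6, 2)}
```

Six pageviews collapse to two uniques—which is why clicking from fresh identifiers, as with Tor, moves the needle more than repeat clicks from one browser.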
Beyond pageviews and uniques, there are other audience attention metrics that matter as inputs into recommendation algorithms. As publisher traffic tracking tool Chartbeat (of which Gizmodo is a customer) states on its website, “Your audience’s attention is worth more than their clicks.” Metrics like “time-on-site”—how long someone spends reading the article—are becoming increasingly important in calculating audience engagement. Publishers and advertisers even measure things like “dwell time”—when you hover over something without clicking—and focus on content that grabs your attention long enough to interrupt your scroll through your feed—“thumbstoppers,” as Facebook likes to call them in mobile advertising.
Pageviews, uniques, and social media shares certainly can impact what publishers decide to cover in the future, especially if certain topics are profitable. A click or share drives attention, and in most cases, attention is translated into advertising revenue, which can sway publishers to cater to topics that are popular and thus sell more ads. That’s why clickbait and listicles exist, and why they do so well on social platforms that leverage sharing and recommendation engines.
Molly’s clicks are more likely to have an immediate impact on the reading recommendations she sees on Facebook or news sites. Based on her previous history, she’ll see more articles about Bernie Sanders and the election, and fewer articles about terrorism. Her clicks will have a much smaller impact on other readers, as her click profile feeds into recommendation engines that map readers’ interests across a website. As a body of signals, her clicks could have an influence—albeit a minuscule, statistically insignificant one—in linking up connected interests that could drive recommendations for readers whose behavior overlaps with Molly’s.
Of course, if she’s not careful, Molly’s clicks could end up looking like a bot, a click farm selling likes, or an app store download farm manipulating popularity rankings. An entire market of services exists to mimic user behaviors to boost traffic numbers or trigger a charge-per-click on web ads: click fraud. The line between what Molly is doing—clicking extra for a cause—and what click farms are doing—clicking to fabricate traffic—is mostly a matter of scale and intent.
Some might dismiss these tactics as clicktivism, or as slacktivism, forms of lazy civic participation online. I like to think of what she’s doing as something more like conscientious clicking.
She’s not doing it to feel better about herself, or to make a statement for her profile about the causes she believes in. She’s not watching a Kony 2012 video and calling it a day. She’s deliberately directing her attention, aware of how her choices might contribute to the larger attention economy and public discourse.
“It’s a weird thing to do when you’re not sure if it has any impact. I just have this faith that it must because big data is such a powerful tool. It’s wishful clicking. Or not clicking,” Molly says.
In her view, it’s kind of like political canvassing: “When you canvass for a candidate you don’t really know if you are making an impact,” she says. “But you do it anyway. And on a large scale, studies show that canvassing does make a difference.” Molly believes her conscientious clicking has some impact, and has been inspired by the grassroots campaigning and hands-on presence on social media that seems to be building up a groundswell of support for Bernie Sanders.
Molly’s extra clicks and abstentions are signals that say more about the publications she frequents than they do about her support for or opposition to the causes she’s trying to affect. As Tim Hwang recently explored in an Atlantic article, though reader attention does translate to ad dollars, that indirect connection does little to address, protest, or boycott a social or political issue. “The loss of an average user matters such a minuscule amount financially that the importance of the boycott narrows to being symbolic and personal, rather than striking directly at a bottom line,” Hwang argues. Still, choosing not to link or not to click, so as to avoid directing traffic to an article, is a political and ethical choice.
Algorithms, too, are not neutral, no matter how much tech companies claim otherwise. They have politics. They are designed by people, with certain biases and assumptions about what they are meant to optimize for and how they are supposed to work. Algorithms that process articles will value clicks if that’s the metric we use as a proxy for attention, importance, or relevance. If we care how articles are surfaced and how they influence popular discourse online, then we have to look to metrics beyond the click.
Molly’s questions touch on the challenges we face trying to interpret the algorithms around us. We can guess at how they work, by understanding more about what inputs they pay attention to. But it’s harder to know what they value, how they weigh certain inputs over others, and how that influences the outputs we can’t see. In her small way, Molly is trying to influence the discourse using the tools available to her. She recognizes the value of her clicks, so she is attempting to deal in the currency of the system.
Are you a conscientious clicker? Share your algorithm gaming tactics in the comments. And send in your questions or examples of how you are living with data to firstname.lastname@example.org. Screenshots and links are a helpful place to start.
Sara M. Watson is a technology critic, a research fellow at the Tow Center for Digital Journalism at Columbia, and an affiliate at the Berkman Center for Internet and Society at Harvard. She tweets @smwat.
Lead illustration by Tara Jacoby