Stop Turning Me Into Bad Data


It just happened again. Some dumbass with an algorithm has decided that I belong in a demographic and is feeding me "relevant" information. But none of it is relevant. And now I'm really pissed off, for reasons that you will never guess because they don't fit your goddamn predictive models.

Image from Ghosts with Shit Jobs

There are a few basic ways that the internet talks to me, and here are two of them, boiled down to their essences:

1. You like science and technology. You like action movies. You must be a male who is also interested in cars and sports. Would you like to buy this bottle of rum?


2. You are a female. You must want a backpack that is pink or lavender. You will be interested in this picture of a baby. Would you like to know more about the best places to have a wedding?

Sometimes I feed the algorithms a random piece of data, just to see what will happen. I told Facebook that I was born in 1923. Now I see a lot of ads for Ensure, retirement funds, and books.


The point, and I'm not exactly blowing your mind with new information right now, is that many of the services we use online every day try to feed us information they assume we will want based on demographic details like gender, age, where we live, and more. (I have yet to be told I must be interested in handcrafted bicycles and documentary films because I'm white, but I'm sure that's coming.)

The problem is that these assumptions become self-fulfilling prophecies. If women only see pictures of pink backpacks for sale, eventually they'll buy one just because they need a backpack — even if they are tragically yearning for one that's black with racing stripes.


And then there's the inverse issue, when a company makes assumptions about who you are based on what you look at or buy. In my first example above, I mentioned that many online services assume I'm male because I'm interested in science. Or because I write about science fiction. That's because their algorithms or other systems have been programmed by lazy humans.

In the human world, it's this kind of lazy thinking that causes production companies to assume that audiences for movies and books about science are largely male — and then, as they slide down the rabbit hole of presumptions, to decide that means they shouldn't have important female characters or stories that deal with interpersonal relationships. Because men can't understand stories where relationships go beyond laser fights. And women — you know, the ones who are looking at those wedding spots online — aren't watching. This is insulting for everybody.


I'm not saying I want my online ads and shopping experiences to be better tailored to my actual preferences. I don't want to feed Facebook or Google all my personal information so that I morph from a demographic into a psychographic. What I actually want is to work and play on an internet that doesn't feed me information opportunistically based on what some idiot decided women like (based, I'm sure, on gathering data from individuals they assumed were women because of their consumer choices).

I feel like my mediascape has become an enormous behavior modification feedback loop. If men are interested in science, then this article must be for men; therefore we will suggest it exclusively to men, who will click on it and prove that only men are interested in science. See how that works? Now I've got a pink backpack to sell to you, lady. No, there are no other choices. When you buy it, you prove that we were right to offer you the pink one.
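That loop is embarrassingly easy to fake on a laptop. Here's a toy Python sketch, with numbers I made up (this is not anyone's actual ad system), showing how "only men click on science articles" becomes true purely by construction:

```python
import random

# Hypothetical setup: two groups with IDENTICAL true interest in the article.
interest = {"men": 0.5, "women": 0.5}
shown = {"men": 0, "women": 0}
clicks = {"men": 0, "women": 0}

random.seed(0)
predicted_audience = "men"  # the lazy assumption baked in up front

for _ in range(1000):
    group = predicted_audience      # only the predicted group ever sees it
    shown[group] += 1
    if random.random() < interest[group]:
        clicks[group] += 1

# The "data" now proves the assumption: every observed click comes from men,
# not because women aren't interested, but because they were never shown it.
print(shown)   # {'men': 1000, 'women': 0}
```

Run it and the click logs look like ironclad evidence that women don't care about science, even though both groups were defined to be equally interested. Garbage assumption in, garbage "big data" out.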


Welcome to the future, where our crap algorithms allow us to pretend that cultural stereotypes are objective truth based on big data.

Annalee Newitz is the editor-in-chief of io9 and this is her column. She's the author of Scatter, Adapt and Remember: How Humans Will Survive a Mass Extinction. Follow her on Twitter, or email her.