When you were a kid and stole your friends’ toys, your parent probably asked you this angry hypothetical: “How do you think that made them feel?” But what if you actually could feel what another person is feeling? This week, we travel to a future where humans have invented an empathy machine.
Before we dive into what that machine might do to our world, let’s pause for a minute to think about what empathy actually is. There are two forms of empathy: cognitive empathy and emotional empathy. So says Maia Szalavitz, a journalist and coauthor of the book Born for Love: Why Empathy Is Essential — and Endangered. She walks us through the differences between the two types.
Already, we’re getting to one of the huge hurdles with this idea. Empathy is a nebulous concept, and the way each of us experiences it in the world is really different. So if we start thinking about how this empathy machine might actually work, we start to struggle with the uniqueness of each human’s experiences. How might a machine like this operate? If it simply records a pattern of neurons firing in your brain, and copies it into my brain, it probably won’t make me feel what you’re feeling.
I also talked to Samantha Rich, who equates this issue with the age-old stoner question: Is the blue I see the same blue you see? In the case of emotion and memory: Do I experience fear or love or excitement the same way you do? Unfortunately, nobody can really answer that question. Rich is a science fiction writer who recently published a fascinating short story in an anthology called Accessing the Future. It’s set in a future where people wear little screens on their chests that broadcast their neural activity. So everybody can see what’s going on with everybody else all the time.
Rich’s story raises some really interesting questions about how much we should really know about each other, and how much insight one can truly get about a person from seeing things like biometric data. Just because you’ve got information about what a person is feeling doesn’t mean you can really understand that feeling.
But let’s say that, in this future, we have some good way of actually exchanging full emotional experiences. That I can put on a helmet and really feel what you felt at a certain moment. Really understand it. Some think that that kind of device would be incredibly powerful. One of those people is Heather Schlegel, a social scientist and futurist who thinks a lot about how technology and intimacy interact.
A few years ago, Schlegel worked on a project called UME, an imagined future company that offered a service very similar to the one we described at the beginning of this podcast. “UME is the leading company in emotion extension technology. At UME, we connect how you feel with your loved ones,” their website says. As part of this design future, Schlegel and her team came up with some theoretical news story topics describing the public’s reaction to this kind of technology. They include things like this: athletes use the emotional connection with their coaches to get better insight and feedback; surgeons and nurses connect with their patients to better understand their needs; a property developer creates a UME community where all residents must have embedded empathy technology; a primary school rejects any students without an empathy connection; UME encounters religious pushback, seen as interfering with “higher connections” to God.
If you could get this kind of system to work, there would be all kinds of great applications. People could use it to become closer, to understand other people’s emotions, to feel experiences they might never get the chance to have themselves. The justice system could use it as a punishment, forcing criminals to feel the pain their crimes caused their victims and families. People with social anxiety could use it to better understand the interactions they’re having.
But here again is where we need to pause. It seems like being able to experience other people’s emotions and mental states would be a good thing, right? That it could even change people’s minds about subjects, give them a better understanding of people whom they haven’t met, and maybe even purge the prejudices that most people harbor inside them. But is that actually true? Does experiencing other people’s emotions and memories actually make you more respectful of them as people?
Possibly, if done correctly, says Szalavitz. But we’d have to really be careful about how we use it. As Rich points out in her story, just having some pieces of information about a person’s emotions can’t really tell you everything. And it’s still unclear whether it’s possible to experience another person’s feelings at all, without our own history and memories and biases mucking it all up.
Here’s the thing. Humans, as a species, have a huge capacity for empathy. Much of our world and economy is built on trust, which requires some amount of empathy. And, in fact, we already kind of have an empathy box. It’s called the internet. The internet can carry stories of experiences you’ve never even dreamed of, ones you’ll never have yourself. And if we listen to those stories and try to understand what they’re telling us, we can in fact do a lot of the work that this future technology might do for us. So while we wait for this magical box to come, we can turn to another magical box in front of us and do a lot of that work ourselves.
Besides, as Szalavitz points out, there’s at least one invention we have right now that greatly increases empathy: books.
As usual, if you have thoughts about futures we should explore on the podcast, leave us a note in the comments, on Twitter, or email us at firstname.lastname@example.org. We also have a Facebook page now! Check it out. You can subscribe to the podcast on iTunes, SoundCloud, or via whatever RSS reading app you choose.
Illustration by Sam Woolley