Most of us might think of Facebook as the social network of choice for suburban moms and conspiracy theorists, but the company hasn’t been shy about branching out to become much more than an app on our phones, even if that’s the last thing we want. Here’s an example: earlier today, Facebook put out a company blog post outlining its latest venture, this time into the wild world of medicine.
As the post explains, the company’s AI-research wing—called FAIR—has spent the past two years quietly toiling alongside professionals at NYU Langone Health to create what they call fastMRI: an algorithm that promises to cut down on the long-as-hell process folks typically undergo when stepping into an MRI machine. They just need some photos of your bones to do it.
Okay, not all of your bones—at least not yet. The first round of research for the fastMRI program is based exclusively on a massive open-source library of pictures from a slew of knee MRIs that NYU helpfully offered up for the sake of the project. By training a machine-learning algo on these knees, the team was able to create a model that can guesstimate an accurate MRI image using a fourth of the data that your typical MRI machine takes in when putting together a crystal-clear image of your bones or brain or what have you. Or, put another way: because an algorithm’s doing the heavy lifting here, you get to spend less time getting photographed while trapped inside a weird, noisy metal tube.
You see, the main reason that any given session in an average MRI machine can end up lasting more than an hour boils down to the way those machines work in the first place, which is... a bit complicated to explain. In short: if a machine is, say, taking a scan of a given person’s head (or brain, as the case may be), that means applying super-strong magnetic fields to a person’s head, hitting that head with radio-frequency pulses, and then forming a composite image based on the behavior of the god-knows-how-many protons in that person’s head once they’ve been subjected to those sorts of signals. As it turns out, the signals these protons give off can be very weak, which means that the whole mess might need to be repeated again and again in order to form a crystal-clear composite.
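If you want a feel for why grabbing less data speeds things up, here’s a toy sketch (emphatically not Facebook’s actual method, and the 64×64 random “scan” is a made-up stand-in): an MRI scanner samples spatial-frequency data, called k-space, and the image is roughly the inverse Fourier transform of that data. Sample only a quarter of k-space and the naive reconstruction comes out aliased and ugly; a fastMRI-style model is trained to clean up exactly that kind of degraded picture.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))      # stand-in for a fully sampled scan

# What the scanner actually measures: spatial frequencies ("k-space").
k_space = np.fft.fft2(image)

# Keep only every 4th row of k-space -- roughly 25% of the data,
# mimicking a 4x-accelerated (and therefore 4x-shorter) acquisition.
mask = np.zeros(k_space.shape)
mask[::4, :] = 1
undersampled = k_space * mask

# Naive zero-filled reconstruction from partial data: aliased and lossy.
# A trained model's job is to turn something like this back into a
# diagnostically useful image.
naive = np.abs(np.fft.ifft2(undersampled))

error = np.linalg.norm(naive - image) / np.linalg.norm(image)
print(f"fraction of k-space kept: {mask.mean():.2f}")
print(f"relative reconstruction error: {error:.2f}")
```

The point of the toy: the raw speedup comes from skipping measurements, and the machine learning only exists to paper over the damage that skipping causes.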
Using AI to cut down on the time it takes to get that final picture isn’t a new idea by any stretch, and I’ll be the first to admit it sounds like a great idea—until you remember that Facebook’s one of the names behind this particular project. This is a company whose insane rate of growth is largely built on gathering our data, bundling that data, and then sliding it over to third parties like major advertisers or federal agencies. And that’s not including, y’know, all those massive data breaches the company somehow keeps finding itself at the center of.
And just like Facebook’s ambitions as a platform, its data sources have also been rapidly branching out: Facebook doesn’t just know what we’re doing on its platform or on Instagram, it also knows what we buy, where we buy it, and, well, a ton of other stuff, thanks to its oodles of partnerships across a buffet of industries, including—surprise surprise—big pharma and medicine.
This year alone, the platform’s made a noticeable push to court ad dollars from major medical brands, and it’s been working, thanks in part to the pharma data we’ve been freely giving up about ourselves online. And if that part of our medical history’s being used for targeting, then, well, there isn’t much stopping Facebook from doing the same with any other sort of medical dataset, even one that’s sourced from our literal bones and organs. Generally, these sorts of details are supposed to be covered under legislation like HIPAA, but as we’ve covered before, the lines between what can and can’t be monetized get kind of blurry depending on whether that data’s sourced from a doctor or a tech company. And in the case of something like fastMRI—or something like Alphabet’s Verily—we’ve got a partnership between public health and privatized tech, meaning that HIPAA might not be protecting these MRI scans as much as we’d hope.
Facebook undoubtedly saw some of the potential discomfort before putting out this blog post, because the company snuck in this little disclaimer towards the middle:
(The fastMRI data used in the project, including scans used for the study, are from the open-source dataset that NYU Langone created in 2018. Before open-sourcing the data, NYU Langone ensured that all scans were de-identified, and no patient information was available to reviewers or researchers working on the fastMRI project. No Facebook user data was contributed to creation of the fastMRI data set.)
Okay, so it looks like these bone scans aren’t being used for tracking and targeting—at least, not yet—but the team’s own blog post makes it sound like the fastMRI project isn’t going to just stop with pictures of strangers’ knees. “Today’s clinical study is an important step forward, but there are many more advances to come. Next, Facebook AI and NYU Langone researchers want to show that fastMRI works just as well with other vital organs, such as the brain,” Facebook wrote.
Even if we’re going to scrap the whole “my bones are being used against me” narrative (which, I’ll admit, is more speculative than I’d prefer), there are still a ton of reasons that you shouldn’t want this company anywhere near your medical data. This is a company that’s shown again and again that it’ll put its profit margin before the safety of its users, no matter how much it tries to pretend otherwise. Hell, the same day that Facebook put out this AI research, reports emerged that the company is still withholding evidence about its role in the genocide in Myanmar back in 2017. Here in the States, the company’s continuously struggled to take any action against the anti-vax groups that’ve already caused at least one child’s death. And, of course, there’s the whole alleged goat murdering thing.
I’m not saying Facebook’s medical research won’t genuinely help anyone, but I am saying that maybe fastMRI would be less icky if it were being spearheaded by absolutely any other company.