
The Screen Actors Guild Wants to Protect Its Members From Deepfakes

Photo: Getty

The labor union representing Hollywood’s actors, singers and other media artists wants to make sure its members aren’t digitally manipulated into porn stars without their consent. In the most recent issue of its union magazine, the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) said it’s “closely watching the development of so-called deepfakes,” Deadline reports.


Nonconsensual fake porn isn’t new, but affordable and accessible machine learning tools have made it simpler than ever for internet trolls to develop ultra-realistic fake videos. Called “deepfakes,” such videos gained widespread media attention at the end of last year when a subreddit dedicated to creating them attracted thousands of subscribers. The subreddit has since been banned, but it’s now clear just how easy it is to swap out a porn star’s face with someone else’s if you have enough reference images—an easy feat if your target is a celebrity whose face is plastered all over the internet.

SAG-AFTRA president Gabrielle Carteris wrote that the union is monitoring the tech that has “the ability to steal our images and superimpose them onto another person’s body in potentially unpleasant and inappropriate digital forms” and “fighting back when the technology infringes on our members’ rights.” A SAG-AFTRA spokesperson told Deadline that the union is reviewing ways to limit the use of unauthorized digital recreations of its members, not just in pornography, but also in advertisements, fake news, video games and movies.


The spokesperson noted that New York and Louisiana don’t have post-mortem rights of publicity. A right of publicity means that an individual has the right to control the commercial use of their likeness. These laws vary by state. If you die while domiciled in a state without strong posthumous rights of publicity, your likeness can essentially be capitalized on without the consent of your heirs. SAG-AFTRA wants to change that.

“We are talking with our members’ representatives, union allies, and with state and federal legislators about this issue right now and have legislation pending in New York and Louisiana that would address this directly in certain circumstances,” the spokesperson told Deadline. “We also are analyzing state laws in other jurisdictions, including California, to make sure protections are in place. To the degree that there are not sufficient protections in place, we will work to fix that.”

It makes sense that the labor union would be fighting for stronger rights of publicity in New York and Louisiana—a lot of celebrities live in these states and, as it stands, do not have very strong protections for their likeness after they die. In fact, this became an issue following Marilyn Monroe’s death. Monroe’s estate claimed she was domiciled in California so that it could protect the late actress under the state’s post-mortem right of publicity. The claim was denied; she was instead found to have been domiciled in New York, a state with much weaker publicity protections.

“Stephen Hawking just died. If he were a resident of California at the time of his death, then his heirs, successors or whomever owns his publicity rights at the time he died would be able to enforce under California law,” attorney Joseph Rothberg recently told Gizmodo. “He would be able to stop people from just creating a fake Stephen Hawking for commercial use. But it’s a state by state thing.”


SAG-AFTRA’s legal efforts would help ensure that its members, whose ranks include over 150,000 media workers, wouldn’t have to worry about their images being exploited after death. And for the living, the union also said that it wants to “support new judicial theories to extend protections to individuals and their heirs who are victimized in fake porn videos.”

It’s heartening to see a powerful organization fighting back against a gross form of both harassment and commercial exploitation. As the tools to manipulate someone’s image without their consent become cheaper and easier to use, it’s hard to imagine why all states wouldn’t want to adopt stronger protections for someone’s likeness.


[Deadline via The Verge]


I would go a step further and say that using anyone’s image without explicit permission to produce a compromising, slanderous, or otherwise directly and purposefully harmful video or image that is near-indistinguishable from reality should be prohibited, whether the victim is in the public eye or a private citizen. This is something that should be on the books to protect everyone. I know that’s not how this is going to go—ultimately the laws that get written will have loopholes that give one group advantages over another—but this is certainly something that needs to be addressed.

I know there is certain art and fair use around such technology and demonstrations, and there will certainly be cases where people have signed releases for photography and video imaging, maybe even for work using these learning systems to produce art and video they did not directly create. But if such imagery is used to deceive the public and intentionally slander the individual involved, or is otherwise harmful to their person, then there have to be consequences. It’s one thing to do a parody where it’s obvious the technology is being used and common knowledge that the real person was not actually involved. It’s quite another to take someone who would never publicly participate in such a thing and transpose their likeness so that viewers may believe they did. That should carry strong consequences for those who perpetrated it.