Facebook recently updated its community standards. As the company noted in the announcement accompanying the change, its "policies and standards themselves are not changing"; rather, the company wanted to provide more clarity around a set of existing rules that users have often misunderstood.
While some of the changes provide significantly more detail as to the reasoning behind certain content restrictions, others fall short. And unfortunately, the updated standards do very little to solve the continuing problem of account suspensions for "real names" violations.
Even in the last week and a half, Facebook has continued to suspend users for violations of its "real names" policy, a policy that we've argued causes real-world harm. In the latest story to get publicity, a teen with the legal name Isis King had her account suspended by Facebook for a names policy violation; it was restored only after a media inquiry. The latest update to the community standards won't change the experience of users like Isis King, but it does clarify where Facebook stands.
Prior to the change, the standards read: "On Facebook people connect using their real names and identities." Because Facebook asks for ID when handling appeals and blocks certain words from being entered in the "name" fields at account creation, most users have assumed that when Facebook says "real name," the company really means "legal name."
Following a spate of account takedowns last fall, however, Facebook's Chief Product Officer, Chris Cox, posted a statement in which he said: "our policy has never been to require everyone on Facebook to use their legal name." Shortly thereafter, we noted a shift in the company's language in notifications to users. A section on account security in the Community Standards now reads, in part:
Using Your Authentic Identity: How Facebook's real name requirement creates a safer environment.
People connect on Facebook using their authentic identities. When people stand behind their opinions and actions with their authentic name and reputation, our community is more accountable...
Nevertheless, the company's Statement of Rights and Responsibilities—the legal text underpinning the Community Standards—still contains language referring to real names:
Facebook users provide their real names and information, and we need your help to keep it that way.
While we're glad to see that Facebook is changing how it communicates this guideline to users, it's a very small change in the face of the continuing reports that Facebook is suspending users' accounts for name policy violations.
Facebook's content policies—and how they are implemented—have often left users confused. For example, the company told us that images of mothers breastfeeding were never meant to be restricted, yet numerous instances of such photos being removed have led to a persistent belief that the company bans such images.
The latest iteration of the community standards is intended to provide additional clarity to users. As the New York Times' Vindu Goel put it, "[Despite] its published guidelines, the reasoning behind Facebook's decisions to block or allow content [is] often opaque and inconsistent."
With respect to some topics, Facebook has certainly met its goal. The section on sexual violence and exploitation, for example, lays out numerous examples of what the company deems unacceptable. A section on "attacks on public figures" clarifies that Facebook does not remove criticism of public figures unless it constitutes hate speech, in which case the company treats the content as it would if the target were not famous.
Other sections leave much to be desired. While Facebook's rules about "dangerous organizations" make clear that groups engaged in "terrorist" or "organized criminal" activity have no place on the platform, there is no additional clarity on how terrorist groups are defined, despite some evidence that the definitions are underpinned by US law.
Facebook has claimed that if a person's account is suspended, those appeals are read by real people who can look into the specifics. In practice, the process falls short.
Although Facebook instituted an appeals process in 2011, the process is only available to users whose Page or Profile has been removed; that is, there is no process for appealing when other content—such as photos, posts, or videos—is removed. Furthermore, the process is ambiguous and doesn't seem to make much of a difference to users, many of whom have contacted us following account suspensions.
The appeals form itself is hard to find. It's accessible through the help center, but Facebook doesn't seem to highlight it as an option in the endless screens users face when trying to verify their "authenticity." Once users are in that process, they are directed to update their name rather than sent to the appeal. And the link Facebook provides to its help center during name verification goes to lists of acceptable forms of ID, not to the appeal.
In fact, the appeal isn't available unless an account has been entirely disabled. Some users have provided Facebook with ID bearing a legal name that didn't match the authentic name they use, only to have Facebook put that legal name on the account. We've been contacted by users with abusive stalkers, users whose public-facing jobs use their drag name, and others who've had this experience. Those users can't access the appeals form once their account is erroneously restored.
Finally, in an impressive display of irony, the appeals form requires users to upload an ID. In other words, users who are having trouble with Facebook's ID-based identity verification must do exactly that: upload an ID before even getting the chance to talk to someone. Considering that accounts have been restored with incorrect names in dangerous situations, users' hesitancy to upload an ID just to file an appeal is understandable.
If Facebook cares about its users, it should make its appeals process easier to access and easier to use. It should allow appeals for all types of removed content, not just Profiles and Pages. And it certainly shouldn't require ID as the first step.
While we think it's good that Facebook decided to provide more clarity about its policies, it might be better served by improving those policies and ensuring that Facebook is an accessible, open platform for its millions of users worldwide.
This article first appeared on Electronic Frontier Foundation and is republished here under Creative Commons license.