The Police Foundation doesn’t want the police to call drones “drones.” Because of the public’s association with “military-style weapons like the Predator,” the organization’s 311-page report reads, the term “drone” is “a major obstacle to law enforcement’s ability to convince the public” that police drone programs “could actually increase public safety, not jeopardize it.”
Instead, the report—“Community Policing and Unmanned Aircraft Systems (UAS): Guidelines to Enhance Community Trust,” published in 2016—suggests the clunky “Unmanned Aircraft System” as an alternative. The term was coined by the Department of Defense in 2001 to make military weapons more palatable, and has now been repurposed by police.
To market the drone as part of a public safety system instead of a weapon, the first step is to convince a suspicious public that it isn’t a weapon, but a friendly flying gadget. Commercial dronemaker DJI has made this a priority after announcing a partnership with body camera manufacturer Axon, formerly Taser. Axon is now offering law enforcement agencies two surveillance drones: the Phantom 4 Pro, eggshell white and equipped with image recognition software; and the more expensive Matrice 200 series, metallic grey and tested for resilience to harsh winds and rain. DJI furnishes the drones, while Axon integrates the data they collect with Evidence.com, its cloud storage system used for managing body camera footage.
DJI’s marketing features videos of the Matrice used to film exotic vacations and rescue lost hikers. But anti-surveillance activists are concerned that framing drones around cinematic rescue missions obscures the privacy concerns raised when drones are used for policing.
This is a chief concern among police technology experts: is technology defined by how it’s used in day-to-day or extreme scenarios? Gizmodo spoke with representatives from both DJI and Axon, including members of Axon’s newly formed Artificial Intelligence Ethics Board, to answer a simple question: What problem does police drone surveillance actually solve?
“I don’t think anybody would question that if it’s a real life-or-death search for a missing child that you would allow a drone to fly anywhere it needs to,” Adam Lisberg, Corporate Communication Director for DJI, told Gizmodo, “the same way that you would hopefully allow searchers access to any property they need to find that child.”
The Matrice 200 series is optimized for search and rescue missions, tested for wind and rain resistance and capable of carrying payloads of food or radios. In rescue missions, Lisberg explains, drones could survey specific areas, using long-range livestreaming cameras and thermal imaging to detect survivors or missing children without expending the manpower of launching a full search party.
“These drones are going to get more popular,” Steve Tuttle, Vice President of Strategic Communications for Axon, told Gizmodo. “They’re ever expanding, gonna be used for more crime scene evidence, for more search rescues, [and are] gonna be used to check out what are called ‘fatal funnels.’”
“Fatal funnels” are confined spaces, usually stairwells or hallways, that trap police without cover. In 2012, a Utah man shot six members of a strike team, killing one. The prosecutor on the case said he planned to “go out in a blaze of glory,” hiding while police cleared the lower floors of his home, then opening fire once they entered a narrow hallway. Drones like the Phantom 4 Pro—with its small size, object detection, and infrared systems—could scout dangerous areas ahead of officers.
“It’s a way to see things safely,” Tuttle adds.
As with many police technologies, there’s no unifying nationwide policy on drones. How each department will actually use drones is not entirely clear. The FAA requires certifications and imposes some restrictions (piloting near jails or airports is forbidden, for example) but doesn’t offer specific guidance on most police uses. Without this clear legal framework, police drone policies vary widely from agency to agency.
“We don’t have a good system for police to share best practices. It’s still an antiquated word-of-mouth [system],” Jim Bueermann, AI/Ethics board member and President of the Police Foundation, which released the 2016 report, told Gizmodo. “What vendors can do, because they know a lot about the products and because they interact with so many agencies—they are a wonderful resource of best practices. It’s hard to get multiple lessons from multiple police departments at the same time. These vendors serve as de facto knowledge centers around best practices.”
Axon and DJI don’t take specific policy positions, though both encourage police to work with communities to form their own regulations. This creates recommendations and best practices, but no enforcement, leaving an opening for wildly divergent uses.
In Kentucky, city officials proposed using drones in coordination with gunshot detection technology. Under the system proposed in a federal grant application, if acoustic surveillance devices detected the sound of gunfire, they’d record their location and send the coordinates to drones. The proposal came in response to a longstanding problem: People in areas with high crime don’t call 911 when they hear gunshots. Drones could theoretically capture evidence of suspects and witnesses, but does sending in drones “enhance community trust,” to borrow the Police Foundation’s phrase, if there is none?
In Chicago, Mayor Rahm Emanuel backed a bill that would permit police drones to surveil protestors. Protests can turn violent, surely, but they’re also a First Amendment protected activity. Drone surveillance could deter people from attending, for fear of being surveilled by the government, especially if it is the government they’re protesting.
In May, Oakland passed what the American Civil Liberties Union and Electronic Frontier Foundation consider to be among the “strongest” anti-surveillance regulations, requiring police to get city council approval before acquiring new surveillance devices or even soliciting funding. The ordinance requires public hearings, invalidates many NDAs, offers whistleblower protections, and limits data retention. This is where the power of framing the devices comes into full view. The public may want drones because they’re framed as safety devices, but then police can use them as surveillance devices.
“I think that the police technology space is screaming for regulation,” Barry Friedman, director of the Policing Project at New York University School of Law and another AI/Ethics board member, told Gizmodo. “If vendors and police departments do not start to self-regulate, then they will at some point, in the not too distant future, find themselves regulated.”
Friedman hypothesized a regulatory scheme outside of a governing body that combines third-party audits, self-regulation, and specific-use warrants, getting permission from a judge if police wanted to use drones for anything outside of normal operations. He said, “I think it would behoove the policing tech industry to do some self-regulation, to think about the kinds of things that it builds, for example—whether they build accountability into the item.”
One example is Axon’s “buffering” feature, which continuously records so that the 30 seconds before a body camera is activated are preserved. The feature caught a Baltimore cop seemingly planting drugs that he then turned on his camera to later “discover” and use as evidence. The same feature has led to multiple officers unwittingly filming themselves during acts of police misconduct.
Built-in accountability isn’t perfect, however. Last year, body cameras failed to record the death of Justine Damond, shot by police in Minneapolis. State policy required officers to turn on their cameras during investigations. They did not. Via the “Axon Signal” device Minnesota police purchased, body cameras can be configured to turn on automatically in conjunction with dashboard cameras or opening car doors. The feature wasn’t enabled, and footage of Damond’s death, or the actions taken by the responding officers, was never recorded. Technology and policy can be extremely helpful, but not foolproof when it comes to officer misuse, a point some critics feel is being overlooked.
Hamid Khan, Campaign Coordinator for the Stop LAPD Spying Coalition, says agencies have a long history of using anomalies to obscure averages. Khan argues that while the public drone debate is framed around rescue and prevention uses, agencies will always leave the door open for more troubling uses, even if policy initially prohibits them.
“The capacities and capabilities [of drones] need to be seen not just in that single tool, but in how it fits into larger architectures of surveillance and information gathering.”
Rather than looking at these technologies individually, it’s best to look at them in tandem, especially since the point of Evidence.com is to join together footage from varying sources: CCTV, body cameras, drones, even cell phone videos submitted by the public. Useful when reconstructing a traffic accident, but deeply concerning if used at protests. Without steadfast laws preventing “mission creep,” where tech is used for reasons other than intended, Khan worries officers themselves will set the terms.
“Drones signify what ‘mission creep’ is,” Khan argues.
A 2017 report from Stop LAPD Spying draws parallels between drone usage and LAPD use of helicopters and SWAT teams. As the report asserts, in both cases police mechanisms were introduced for use under “limited circumstances,” but eventually became routine. SWAT, the report notes, was originally reserved for specific instances of rioting in 1967, but is now used to serve warrants and search for drugs. Helicopters, when introduced in 1956, were for traffic control, but are now used to track fleeing suspects and surveil cities from above. A May study from Bard’s Center for the Study of the Drone found that twice as many agencies, including fire and rescue departments, own drones as own manned aircraft. The report estimates 68 percent of these drones were furnished by DJI.
If drones offer police enhanced abilities, Khan argues, how long until these extraordinary enhancements become normalized? Via Evidence.com, officers have access to advanced searching and editing tools, aided by computer vision and object detection, letting them pinpoint even minuscule details easily. What does it mean when they can do this from a mile in the air?
“An individual company being responsible is not going to be sufficient to solve the problem of ethical/legal use of law enforcement technology in the face of diverse technology providers with varying ethical standards, sometimes overly permissive local policies, and individual bad actors,” Miles Brundage, another board member and an AI policy research fellow at the University of Oxford, told Gizmodo over email. While Axon itself can’t regulate all police drones, Brundage said, it can develop internal best practices, which would influence self-regulation throughout the industry and perhaps stymie the most alarming uses.
“The board has already discussed the internal controls used to ensure the confidentiality and integrity of body camera data, and related discussions will need to be had for drone-related data,” Brundage said.
Questions of accountability remain unresolved with body cameras. In Baltimore, officers infamously manipulated body camera footage to abet a safe robbery, filming themselves “discovering” thousands of dollars in cash when really they’d already found and pocketed half of the money before pressing record on their cameras. A 2017 report from policy nonprofit Upturn found that police departments have increasingly made it harder for the public or media to see body camera footage, marring its original purpose as an accountability tool. Forty percent of all body camera footage is never seen by the public. Who gets to see drone footage? Who makes sure it’s telling the full story? And who is held accountable when either fails?
“We’re the company that makes the drones,” DJI’s Lisberg explained. “We would certainly advise law enforcement agencies to come up with a strong policy for how they will use drones and the data they collect, but it’s not really our place to suggest ones to them.”
The question remains: who will? The public knows better than to ask police to police themselves. Tech companies claim to care about public safety, but police accountability is a public safety issue as well. And yet, technology companies continue to augment police powers without forthright, robust enforcement goals.
Activists hopeful for quick fixes provided the policy cover for body cameras, then turned on them after misuse. Hopefully, the distractingly lavish vacation videos won’t provide the same cover for drones before startling abuses arise. Otherwise, our rights could disappear right above our heads.