The New York City Police Department has established a large drone team for emergencies and published a usage policy for its officers. The units will be piloted by Federal Aviation Administration (FAA) Part 107 licensed operators who are in the NYPD Technical Assistance Response Unit (TARU), whose role is to provide audiovisual technical support to the Department and other city agencies. This can include assisting the Emergency Service Unit (ESU) in significant incident situations such as barricade/hostage, search & rescue, hazardous materials, and other accident and crime scenes. The TARU is also tasked with installing and operating surveillance equipment in investigations, and has a fleet of support vehicles.
The NYPD procured 14 drones earlier this year from China’s Dà-Jiāng Innovations (DJI), the world’s market leader for c-drones. The fleet includes 11 Mavic Pro models for quick deployment in tactical operations, 2 Matrice M210 RTK models equipped with Zenmuse X4S gimballed cameras and thermal sensors for 3D mapping and search & rescue using Pix4D software, and an Inspire 1 for training. The NYPD currently has 29 licensed drone pilots and an FAA Certificate of Authorization (COA) for flying in New York City’s controlled Class B airspace [PDF], which contains three large airports and a number of smaller airports, heliports, seaplane bases, and sensitive sites.
In a press release, Police Commissioner James P. O’Neill stated: “As the largest municipal police department in the United States, the NYPD must always be willing to leverage the benefits of new and always-improving technology. Our new UAS program is part of this evolution — it enables our highly-trained cops to be even more responsive to the people we serve, and to carry out the NYPD’s critical work in ways that are more effective, efficient, and safe for everyone.”
At an open-air press conference on December 4 at Fort Totten in the borough of Queens, main site of TARU, the NYPD staged three scenarios demonstrating drone deployments. In an introductory speech, Chief of Department Terence A. Monahan, the highest-ranked uniformed officer on the force, said:
Good afternoon everyone. Thank you for coming out here to our TARU headquarters. We’re here to make a significant announcement. As of today, the NYPD is taking another important step in public safety as we begin utilizing Unmanned Aircraft Systems, commonly known as drones. This cutting-edge technology joins a long list of tools at the NYPD’s disposal during emergency situations, all of which help the men and women in blue uphold their mission: to protect every New Yorker. The use of drones will also enhance officer safety. Whether responding to a hazardous material spill, a hostage situation, or an area that is rendered inaccessible, drone technology will give our cops and their incident commanders an opportunity to see what they’re getting into before they go into harm’s way. Frankly, for this reason alone, it would be negligent for us not to utilize this technology. The NYPD is aware that there are some concerns about this announcement, so let me be clear, NYPD drones will not be used for warrantless surveillance. NYPD drones will be used to save lives, and enhance our response in emergency situations.
Today we will carry out three scenarios to demonstrate the value of this technology. You will see how it would be used during a hazardous material spill, a vehicle collision with fatalities, and a missing person case. As a leader in the law enforcement community, we take pride in never becoming complacent and we continually strive to improve. We also take pride in being thorough. Drone technology has been around for some time. Many of our law enforcement partners and the communities they serve have benefited from it. We have consulted with these agencies to learn about their programs. The NYPD has also proactively sought feedback from local officials as well as advocates. Our training program for the pilots of these aircraft was also designed in close coordination with the Federal Aviation Administration. All of these steps have been taken to ensure a seamless and successful launch of this valuable tool. We want everyone to understand what the mission will be, and what the mission will not be.
Three days later, during a radio interview with host Jeremy Hobson, Chief Monahan expanded on these remarks:
We can use [drones] at hazmat locations, traffic and pedestrian monitoring at large events such as Times Square on New Year’s Eve, which actually will be the first time that we will be utilizing the drones at one of these events. And when we use it at Times Square or an event like that, it will be on a tether which is basically a cord that it’s attached to and it will be an area that’s cleared out below the drone, so God forbid if the drone falls, it falls into an area where there are no people…
There are 900 agencies across the country right now that are using drones. We took our time and really studied the usage and how they can help us do our job better to protect this city. And we took a lot of input in before we decided to use drones. In fact, we sat down with the NYCLU, just to get their input. They’re not completely thrilled with everything in our policies. We did feel it was necessary to talk with them before we implemented…
[A drone] is definitely not going to be used for routine patrol. It’s definitely not going to be used for traffic. It is never, and I say never, ever going to be weaponized. So you’re never going to hear about tear gas or anything else being on a drone. That is absolutely forbidden by our policy. It’s not going to be used for surveillance on individuals. This is going to be used for specific police functions. As a matter of fact, last night we used it for the first time. We had the shooting incident in the Bronx [more info — Ed.] which covered a very large area and we wanted to make sure we document the crime scene as best as we can, so we put the drone up late at night and we were able to take camera pictures all along the route that can be fused together and really give a good view to any prosecutor who’s going to use it in that case, so it really is something that is going to enhance how we develop cases… It worked beautifully. We got great video that we’re compiling together that can show a jury or prosecutor exactly what the scene looked like the night of the incident.
The NYPD published its usage policy concerning drones the day of the press conference, as Procedure No. 212-124 [PDF] in the Patrol Guide. The text mentions a report form designed for logging drone flights, Unmanned Aircraft System (UAS) Deployment Report PD620-151. The NYPD declined to make a blank report available, stating only that it will voluntarily report aggregate data about the program.
The Scope section of the policy lists the following permitted uses:
- Search and rescue operations
- Documentation of collisions and crime scenes
- Evidence searches at large or inaccessible scenes
- Hazardous material incidents
- Monitoring vehicular traffic and pedestrian congestion at large scale events
- Visual assistance at hostage/barricaded suspect situations
- Rooftop security observation at shootings or large scale events at the direction of the Incident Commander
- Public safety, emergency, or other situation with the approval of the Chief of Department
The Additional Data section further clarifies:
- A UAS cannot be used for the following purposes:
  - Warrantless surveillance
  - Routine patrol
  - Traffic enforcement
  - Immobilizing vehicles or suspects
- Absent exigent circumstances, a UAS will NOT be deployed in areas where there is a reasonable expectation of privacy (e.g., to look inside of residences), without first obtaining a search warrant that explicitly authorizes the use of a UAS. After the search warrant is issued, a UAS may be used for a pre-warrant execution safety survey
- A UAS will NOT be used as a weapon or equipped with any weapons
- A UAS will NOT be equipped with facial recognition software
- UAS footage will not be subject to facial recognition analysis, absent a public safety concern
- Use of a UAS must be consistent with the Handschu Guidelines outlined in P.G. 212-71, “Guidelines for the Use of Video/Photographic Equipment by Operational Personnel at Demonstrations.”
The Handschu Guidelines arose from 1980s litigation and more recent legal challenges concerning the NYPD’s surveillance of political organizations.
In a statement, the New York Civil Liberties Union rejected the safeguards in the NYPD’s drone policy:
Police cameras in the skies of New York City offer a new frontier for both public safety and abuses of power. When the NYPD provided us with an early look at a draft policy that would govern the Department’s deployment of drones, the NYCLU expressed serious concerns. The NYPD did make some changes, but we continue to believe the NYPD’s drone program poses a serious threat to New Yorkers’ privacy.
The NYPD’s drone policy places no meaningful restrictions on police deployment of drones in New York City and opens the door to the police department building a permanent archive of drone footage of political activity and intimate private behavior visible only from the sky. While we appreciate the NYPD’s willingness to meet with us before it announced this program, we believe the new policy falls far short of what is needed to balance the department’s legitimate law-enforcement needs against the privacy interests of New Yorkers.
The C-Drone Review has received permission from the NYCLU to publish the letter [PDF] sent to the NYPD on October 12, following their meeting the week before to discuss the NYPD’s draft drone policy. Among the points of concern outlined by the NYCLU is the facial recognition issue. Point (e) of the Additional Data guidelines in the policy, stating “UAS footage will not be subject to facial recognition analysis, absent a public safety concern”, is vague — the policy indicates neither the criteria for a “public safety concern” nor who decides when to employ facial recognition. The NYCLU’s letter on this point says:
We appreciate that the policy prohibits equipping a UAS with facial recognition software. However, nothing prohibits the use of such software on the footage recorded by a UAS while that footage is retained by the Department. Depending on the frequency of their deployment, UAS devices may generate recordings that capture the movements and activities of hundreds of thousands of people. If those recordings are subjected to facial recognition or any other form of automated analysis or analytics, it would constitute a substantial intrusion into the lives of countless New Yorkers. And while facial recognition is one current area of concern, the tools and technologies available for use with UAS devices and footage will continue to advance in ways not contemplated by the proposed policy, including software capable of recognizing particular behaviors or a person’s gait. The Department’s policy should ensure that UAS will not be equipped with and that footage captured by UAS will not be subjected to any form of image analysis with the intention of identifying individuals or generating descriptive metadata on them.
Other civil liberties groups have warned against law enforcement use of facial recognition technology, such as the Electronic Frontier Foundation (EFF) and Georgetown Law’s Center on Privacy & Technology; the latter is in litigation with the NYPD to obtain clarity concerning the facial recognition software vendors and usage policy of the Department’s Facial Recognition Section, part of the Real Time Crime Center (RTCC) at police headquarters. New York State’s Department of Motor Vehicles (DMV) uses facial recognition to ensure individuals do not obtain multiple licenses (in the US, there is no national identification card as in other countries, so a driver’s license issued by one of the 50 states is often used as primary identification). The US Federal Bureau of Investigation maintains a nationwide database, the National Crime Information Center (NCIC), which contains 14 persons files, such as Immigration Violator, Missing Person, and Protection Order. It is not known whether the NYPD attempts to match its photos (mugshots, video camera footage stills, or witness-supplied images), or photos found on social media such as Facebook or its subsidiary Instagram, against these or other databases beyond its own internal database of arrested persons. A Wall Street Journal article earlier this year indicated that the Facial Recognition Section has access to driver’s license ID photos from 31 states, according to Clare Garvie at Georgetown CPT, although New York is not among them.
A recurring theme in studies is “algorithmic bias” or the “coded gaze”: flawed design that returns different levels of accuracy depending on a person’s gender or skin shade. Researchers from the Massachusetts Institute of Technology (MIT) Media Lab and Microsoft Research found, in a study comparing IBM, Microsoft, and Face++ artificial intelligence (AI) services [PDF], that face recognition was measurably less accurate for women than for men, and in particular for dark-skinned women. In response, IBM issued a statement affirming its deep commitment to delivering services that are “unbiased, explainable, value aligned, and transparent”, while Microsoft told the researchers: “We believe the fairness of AI technologies is a critical issue for the industry and one that Microsoft takes very seriously. We’ve already taken steps to improve the accuracy of our facial recognition technology and we’re continuing to invest in research to recognize, understand and remove bias.”
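The methodology behind such findings is straightforward to illustrate: run a classifier over a labeled benchmark, then compute accuracy separately for each demographic subgroup and compare the gaps. The sketch below is a minimal, hypothetical example of that bookkeeping — the subgroup names and results are invented for illustration and do not come from the MIT/Microsoft study’s data.

```python
# Minimal sketch of how per-subgroup accuracy gaps are measured.
# The records below are hypothetical, for illustration only.

def subgroup_accuracy(records):
    """Compute classification accuracy per demographic subgroup.

    records: iterable of (subgroup, predicted_label, true_label) tuples.
    Returns a dict mapping subgroup -> accuracy in [0, 1].
    """
    totals, correct = {}, {}
    for subgroup, predicted, actual in records:
        totals[subgroup] = totals.get(subgroup, 0) + 1
        if predicted == actual:
            correct[subgroup] = correct.get(subgroup, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}

# Hypothetical gender-classification results on a small benchmark:
results = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "male", "female"),   # misclassification
    ("darker_female", "female", "female"),
]
print(subgroup_accuracy(results))
# → {'lighter_male': 1.0, 'darker_female': 0.5}
```

A gap like the one above (100% vs. 50%) is the kind of disparity the researchers quantified, at scale, across commercial services.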
The US Government Accountability Office (GAO) published a report last year, finding that the FBI “had not fully adhered to privacy laws and policies and had not taken sufficient action to help ensure accuracy of its face recognition technology.”
In this context of poor oversight of facial recognition technologies, the Algorithmic Justice League and the Center on Privacy & Technology at Georgetown Law yesterday launched the Safe Face Pledge, an initiative to build support among technologists and companies for avoiding facial recognition-based government surveillance, mitigating law enforcement abuse, eliminating bias in the technology, and, among other goals, promoting transparent analysis of tools on the market such as Idemia/Safran’s Morpho suite, Amazon’s Rekognition, and IBM’s Intelligent Video Analytics i2 Facial Recognition Solution.