Axon and DJI Partner Up to Equip US Police With Very Capable Flying Helpers
The drone hype of recent years seems to be subsiding, giving way to ICOs and AI. But what if you take the so-yesterday drone debate, mix it with evergreen Skynet-esque AI controversies, and add just a pinch of ever-so-vigilant government? This time, you probably won’t like the answer, let alone its timing, as that mixture is already being brewed in the U.S.
On June 5th, the popular drone manufacturer DJI announced a partnership with Axon, a company supplying police forces with gear such as body-worn cameras and Taser electroshock weapons. The aim of the partnership is to provide local law enforcement across the US with surveillance drones that feed video to AI analysis systems.
A Bird’s Eye View on Crime
In partnership with DJI, Axon is launching the Axon Air program, aimed at providing a packaged solution to the police. The solution includes DJI’s video-capable Matrice 210 and Phantom 4 drones connected to Evidence.com, Axon’s cloud service that already aggregates footage from the body cameras worn by police officers. Axon will also provide its own AI capabilities to analyze police footage.
As a result, the solution is meant to be used for crime scene analysis, evidence collection, crowd monitoring, search and rescue operations, natural disaster response, and many other situations in which its capabilities may have a decisive effect.
“DJI’s partnership with Axon allows law enforcement agencies to add drone capabilities and data services through the same trusted provider they rely on for the tools, data and support they need to do their jobs safely and effectively,” Michael Perry, Managing Director of North America at DJI said in a press release. “Law enforcement agencies are rapidly adopting drones for their work, and often need guidance on how to establish a drone program and integrate it into their departments. DJI’s Axon Air partnership will strengthen and enhance law enforcement’s ability to protect public safety, respond to emergencies and save lives.”
Yet all those undoubtedly useful applications are somewhat overshadowed by substantial concerns about privacy and civil liberties.
Surveillance Took Off to the Skies Long Ago
There are several cases to consider when musing about the possible ramifications of aerial surveillance and advanced video analysis systems.
In 2016, the Baltimore police were using wide-angle cameras mounted on a Cessna plane to watch the city from above. The system was in use during the protests over the death of Freddie Gray, who was allegedly killed in a so-called “rough ride”, a form of police misconduct in which officers place a handcuffed suspect in a van and then throw them about by driving recklessly. Notably, while the extensive ground and aerial surveillance systems should have recorded the van in question along its entire route, they did not seem to help resolve the case. Adding to this controversial example of surveillance systems’ “contribution to public safety” is the fact that the police department never disclosed the program to the public. It came to light only when Bloomberg published a report on the matter.
Another notable case took place in Louisville, Kentucky, where municipal authorities decided to use automated drones connected to ShotSpotter, a previously deployed gunshot detection system. The drones help locate suspects and analyze crime scenes, but such an extension of the existing surveillance apparatus raises further concerns about the privacy and anonymity of bystanders who are caught on camera by accident.
Furthermore, in May 2018, Illinois lawmakers passed a bill permitting the police to use drones and facial recognition tech for surveillance at “large-scale events.” The bill’s ostensible goal is to ensure public safety at festivals or concerts, but it also allows the authorities to observe crowds at political protests and rallies. Thus, individuals attending such events could be easily identified and targeted. Notably, the new rules appear to include loopholes allowing police not only to use video-capable drones, but also to arm them with things like tear gas and non-lethal rubber bullets.
Obviously, all of the advancements and innovations in surveillance mentioned above are meant to ensure public safety and won’t be used to hunt down potentially disloyal individuals and political nonconformists. Or so they say.
Privacy and Civil Liberties Activists Are Concerned
While airborne surveillance is disturbing in itself to some, it is the actual use of footage taken from the air, from CCTV, or from officers’ body cameras that raises the most significant problems. And many of those center on AI-driven processing.
To address potential ethical and constitutional issues around law enforcement’s use of AI-based analytics, such as face and license plate recognition systems, Axon has established an AI Ethics Board and promised to publish “one or two” reports describing the Board’s activities annually.
“We believe the advancement of AI technology will empower police officers to connect with their communities versus being stuck in front of a computer screen doing data entry. We also believe AI research and technology for use in law enforcement must be done ethically and with the public in mind. This is why we’ve created the AI Ethics Board – to ensure any AI technology in public safety is developed responsibly,” said Axon CEO and founder, Rick Smith.
Despite this effort, the American Civil Liberties Union and a number of other organizations sent a letter to the Axon AI Ethics Board regarding ethical product development and law enforcement, outlining their most important concerns about the company’s products and policies.
“Law enforcement in this country has a documented history of racial discrimination. Some agencies have routinely and systematically violated human and constitutional rights. Some have harassed, assaulted, and even killed members of our communities. These problems are frequent, widespread, and ongoing,” the letter reads. “Because Axon’s products are marketed and sold to law enforcement, they sometimes make these problems worse. For example, Axon’s body-worn camera systems, which should serve as transparency tools, are now being reduced to powerful surveillance tools that are concentrated in heavily policed communities.”
The letter urged Axon’s AI Ethics Board to take several important points into consideration. These points are:
- Certain products are categorically unethical to deploy.
- Robust ethical review requires centering the voices and perspectives of those most impacted by Axon’s technologies.
- Axon must pursue all possible avenues to limit unethical downstream uses of its technologies.
- All of Axon’s digital technologies require ethical review.
The first point elaborates on the notion that real-time face recognition may infringe “constitutional freedoms of speech and association, especially at political protests,” and that face recognition tech is not perfectly reliable, with accuracy rates that vary depending on the race and gender of the subject. Such systems may therefore misidentify individuals and mark innocent people as suspects.
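The misidentification risk is easy to underestimate, because even a highly accurate system produces mostly false alarms when genuine matches are rare in the crowd. Here is an illustrative base-rate calculation; all the numbers are hypothetical assumptions for the sake of the example, not Axon or DJI specifications:

```python
# Illustrative base-rate math: a face recognition system that is "99% accurate"
# still flags mostly innocent people when true watchlist matches are rare.
# All figures below are hypothetical assumptions.

population = 10_000          # faces scanned at a large event
actual_suspects = 10         # people genuinely on a watchlist
true_positive_rate = 0.99    # assumed chance of flagging a real match
false_positive_rate = 0.01   # assumed chance of flagging an innocent person

true_hits = actual_suspects * true_positive_rate
false_hits = (population - actual_suspects) * false_positive_rate
precision = true_hits / (true_hits + false_hits)

print(f"Expected true matches:  {true_hits:.1f}")    # ~10
print(f"Expected false matches: {false_hits:.1f}")   # ~100
print(f"Share of flags that are correct: {precision:.0%}")
```

Under these assumptions, roughly nine out of every ten people flagged would be innocent bystanders, which is the letter's concern in numerical form.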
The second point of the letter urged the Board to “invite, consult, and ultimately center in its deliberations the voices of affected individuals and those that directly represent affected communities.” Since Axon’s AI Ethics Board includes “well-respected academics, practitioners, advocates, and law enforcement representatives,” but no one from the communities affected by mass incarceration and law enforcement violence, the authors argue, its deliberations and decisions risk being biased.
The third point emphasized that product design decisions meant to prevent unethical uses will not be enough on their own. The letter called on the Board to propose “novel ways to limit unethical uses of Axon’s products,” such as developing specific contractual terms and vital policy safeguards for customers to comply with. And if the unethical uses of a certain product prove too hard to limit, the authors urged the Board to “recommend against the development or sale of that product.”
Finally, the fourth point of the letter urged the Board to examine “all of Axon’s current and future digital products,” as they may act as data sources for future AI solutions and “implicate independent ethical concerns.” As examples, the letter cited Axon Citizen, a whistleblowing system, and Evidence.com, a centralized repository of digital evidence. Both products, the authors note, can affect privacy and amplify discriminatory behavior if not used responsibly.
As Axon builds a massive centralized hub for law enforcement surveillance data, its combination of airborne and wearable cameras, CCTV, IT systems, and other tech is becoming a single must-have product for law enforcement agencies far and wide. Beyond the chance to monopolize the entire market, the company is getting its hands on a rapidly growing pool of data that could be invaluable for training its future AI systems. And the bigger that pool gets, the worse the potential consequences of misuse.
Of course, the ability to prevent crime and solve complicated cases more efficiently is a huge step toward a safer, more secure environment. Similarly, it would be nice to prove someone’s guilt or innocence with just a couple of clicks, or to save lives by dispatching police drones to an ongoing crime scene. The future is nigh, and surveillance tech will undoubtedly have its place in it. But no device or code can be 100% reliable, and when it comes to law and order, a mistake may just cost too much.
Follow us on Twitter to stay up to date on the latest developments in the regulation of new technologies, and be the first to read expert opinions.