A majority of Axon’s AI Ethics Board resigned yesterday in protest, following the company’s announcement last week that it planned to equip drones with Tasers and cameras as a way to end mass school shootings.
The company withdrew its proposal on Sunday, but the damage was done. Axon had asked the advisory board last year, and again last month, to consider a pilot program to equip select police departments with Taser drones. A majority of the AI Ethics Board, made up of AI ethics experts, law professors, and advocates of police reform and civil liberties, opposed both proposals. Advisory board chairman Barry Friedman told WIRED that Axon never asked the group to review any scenario involving schools, and that launching the pilot program without addressing previously expressed concerns is dismissive of the board and the established process.
In a joint resignation letter made public today, nine members of the AI Ethics Board said the company appeared to be trading on the tragedy of the recent mass shootings in Buffalo and Uvalde, Texas. Despite citing both mass shootings in the press release announcing the pilot project, Axon CEO Rick Smith denied in a Reddit AMA that the company’s proposal was opportunistic. Smith said a rollout could take years, but he envisions 50 to 100 Taser drones in a school, operated by trained staff. Before Axon paused the pilot project, Friedman called it an “ill-conceived idea” and said that if the idea is unlikely to come to fruition, Axon’s pitch “distracts the world from real solutions to a serious problem.”
Another signatory to the resignation letter, University of Washington law professor Ryan Calo, calls Axon’s idea of testing Taser drones in schools “a very, very bad idea.” Meaningful change to curb gun violence in the United States requires addressing issues such as alienation, racism, and widespread access to guns. The children in Uvalde, Texas, did not die, Calo says, because the school lacked Tasers.
“If we’re going to tackle the prospect of violence in schools, we all know there are much better ways to do it,” he says.
The board had previously expressed concerns that armed drones could lead to increased police violence, especially in communities of color. The advisory board had planned to publish a report evaluating the pilot program this autumn.
The real disappointment, Calo says, isn’t that the company didn’t do exactly what the board advised. It’s that Axon announced its Taser drone plans before the board could fully articulate its opposition. “Suddenly, out of the blue, the company decided to just abandon that process,” he says. “That’s why it’s so discouraging.”
He has a hard time imagining that police, or trained personnel in a school, would have the situational awareness to deploy a Taser drone judiciously. And even if drone operators did manage to spare the lives of suspects or of people in marginalized or vulnerable communities, the technology wouldn’t stay confined to those uses.