ShotSpotter document reveals key human role in AI gunshot tech

CHICAGO — In more than 140 cities across the United States, ShotSpotter’s artificial intelligence algorithm and intricate network of microphones evaluate hundreds of thousands of sounds a year to determine if they are gunfire, generating data now being used in criminal cases nationwide.

But a confidential ShotSpotter document obtained by The Associated Press outlines something the company doesn’t always tout about its “precision policing system” — that human employees can quickly overrule and reverse the algorithm’s determinations, and are given broad discretion to decide if a sound is a gunshot, fireworks, thunder or something else.

Such reversals happen 10% of the time, according to a 2021 company account. Experts say the practice could introduce subjectivity into increasingly consequential decisions, and that it runs counter to one of the reasons AI is used in law-enforcement tools in the first place: to lessen the role of all-too-fallible humans.

Investigation Tracked ShotSpotter

FILE – A ShotSpotter Dispatch program is in operation within the Fusion Watch department at the Las Vegas Metropolitan Police headquarters in Las Vegas on Wednesday, Jan. 13, 2021. (L.E. Baskow/Las Vegas Review-Journal via AP, File)
“I’ve listened to a lot of gunshot recordings — and it is not easy to do,” said Robert Maher, a leading national authority on gunshot detection at Montana State University who reviewed the ShotSpotter document. “Sometimes it is obviously a gunshot. Sometimes it is just a ping, ping, ping. … and you can convince yourself it is a gunshot.”

Marked “WARNING: CONFIDENTIAL,” the 19-page operations document spells out how employees in ShotSpotter’s review centers should listen to recordings and assess the algorithm’s finding of likely gunfire based upon a series of factors that may require judgment calls, including whether the sound has the cadence of gunfire, whether the audio pattern looks like “a sideways Christmas tree” and if there is “100% certainty of gunfire in reviewer’s mind.”

ShotSpotter said in a statement to the AP that the human role is a positive check on the algorithm and the “plain-language” document reflects the high standards of accuracy its reviewers must meet.

“Our data, based on the review of millions of incidents, proves that human review adds value, accuracy and consistency to a review process that our customers — and many gunshot victims — depend on,” said Tom Chittum, the company’s vice president of analytics and forensic services.

Chittum added that the company’s expert witnesses have testified in 250 court cases in 22 states, and that its “97% aggregate accuracy rate for real-time detections across all customers” has been verified by an analytics firm the company commissioned.

Another part of the document underscores ShotSpotter’s longstanding emphasis on speed and decisiveness, and its commitment to classify sounds in less than a minute and alert local police and 911 dispatchers so they can send officers to the scene.

Experts say the tight time pressure could encourage ShotSpotter reviewers to err in favor of categorizing a sound as a gunshot, even if some evidence for it falls short, potentially boosting the numbers of false positives.

“You’re not giving your humans much time,” said Geoffrey Morrison, a voice-recognition scientist based in Britain who specializes in forensics processes. “And when humans are under great pressure, the possibility of mistakes is higher.”

FILE – ShotSpotter equipment overlooks the intersection of South Stony Island Avenue and East 63rd Street in Chicago on Tuesday, Aug. 10, 2021. (AP Photo/Charles Rex Arbogast, File)
ShotSpotter says it published 291,726 gunfire alerts to clients in 2021. That same year, in comments to AP appended to a previous story, ShotSpotter said more than 90% of the time its human reviewers agreed with the machine classification, but the company invested in its team of reviewers “for the 10% of the time where they disagree with the machine.” ShotSpotter did not respond to questions on whether that ratio still holds true.

ShotSpotter’s operations document, which the company argued in court for more than a year was a trade secret, was recently released from a protective order in a Chicago court case. In that case, police and prosecutors used ShotSpotter data as evidence in 2020 to charge a Chicago man, Michael Williams, with murder for allegedly shooting a man inside his car. Williams spent nearly a year in jail before a judge dismissed the case for insufficient evidence.

Evidence in Williams’ pretrial hearings showed ShotSpotter’s algorithm initially classified a noise picked up by microphones as a firecracker, making that determination with 98% confidence. But a ShotSpotter reviewer who assessed the sound quickly relabeled it as a gunshot.

The Cook County Public Defender’s Office says the operations document was the only paperwork ShotSpotter sent in response to multiple subpoenas for any guidelines, manuals or other scientific protocols. The publicly traded company has long resisted calls to open its operations to independent scientific scrutiny.

Fremont, California-based ShotSpotter acknowledged to AP it has other “comprehensive training and operational materials” but deems them “confidential and trade secret.”

ShotSpotter installed its first sensors in Redwood City, California, in 1996, and for years relied solely on local 911 dispatchers and police to review each potential gunshot until adding its own human reviewers in 2011.

Paul Greene, a ShotSpotter employee who testifies frequently about the system, explained in a 2013 evidentiary hearing that staff reviewers addressed issues with a system that “has been known from time to time to give false positives” because “it doesn’t have an ear to listen.”

“Classification is the hardest element of the process,” Greene said in the hearing. “Simply because we do not have … control over the environment in which the shots are fired.”

FILE – Law enforcement personnel use an interactive electronic map for the ShotSpotter Dispatch program running within the Fusion Watch department at the Las Vegas Metropolitan Police Headquarters Wednesday, Jan. 13, 2021, in Las Vegas. (L.E. Baskow/Las Vegas Review-Journal via AP, File)
Greene added that the company likes to hire ex-military and former police officers familiar with firearms, as well as musicians because they “tend to have a more developed ear.” Their training includes listening to hundreds of audio samples of gunfire and even visits to rifle ranges to familiarize themselves with the characteristics of gun blasts.

As cities have weighed the system’s promise against its price tag — which can reach $95,000 per square mile per year — company employees have explained in detail how its acoustic sensors on utility poles and light posts pick up loud pops, booms or bangs and then filter the sounds through an algorithm that automatically classifies whether they’re gunfire or something else.

But until now, little has been known about the next step: how ShotSpotter’s human reviewers in Washington, D.C., and the San Francisco Bay area decide what is a gunshot versus any other noise, 24 hours a day.

“Listening to the audio downloads are important,” according to the document written by David Valdez, a former police officer and now-retired supervisor of one of ShotSpotter’s review centers. “Sometimes the audio is compelling for gunfire that they may override all other characteristics.”

However, a recent study found humans tend to overestimate their abilities to identify sounds.

The 2022 study, published in the peer-reviewed journal Forensic Science International, compared how well human listeners identified voices against voice-recognition tools. It found that every human listener performed worse than the voice-recognition system alone, and the authors said the findings should lead to the elimination of human listeners in court cases whenever possible.

“Would that be the case with ShotSpotter? Would the ShotSpotter system plus the reviewer outperform the system alone?” asked Morrison, who was one of seven researchers who conducted the study. “I don’t know. But ShotSpotter should do validation to demonstrate that.”
