ShotSpotter’s Incident Review Room is like any other call center.
Analysts wearing headphones sit in front of computer screens, listening intently.
Yet the people who work here have an extraordinary responsibility.
They make the final decision on whether a computer algorithm has correctly identified a gunshot, and whether to send the police.
Making the wrong call has serious consequences.
ShotSpotter has drawn heavy criticism over the past year, with allegations ranging from inaccurate technology to claims that it fuels discriminatory policing.
In the wake of this negative news, the company gave BBC News access to its national incident review center.
ShotSpotter is trying to solve a real problem.
“What makes the system so compelling, we believe, is that 80-95% of gunshots go unreported,” says CEO Ralph Clark.
People fail to report gunfire for several reasons: they may not be sure what they heard, they may assume someone else will call 911, or they may simply not trust the police.
So ShotSpotter’s founders came up with an idea. What if they could bypass the 911 call process altogether?
They invented a system.
Microphones are attached to buildings around a neighborhood. When a loud bang is detected, a computer analyzes the sound and classifies it as either a gunshot or something else. A human analyst then reviews the computer’s decision.
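The three stages described above, detection, machine classification, and mandatory human review, can be sketched in a few lines of code. This is a minimal illustration only: the amplitude threshold and the one-rule "classifier" are hypothetical stand-ins, since ShotSpotter's actual models are proprietary.

```python
# Illustrative sketch of the detect -> classify -> human-review pipeline.
# The threshold and classifier rule are hypothetical placeholders;
# ShotSpotter's real acoustic models are proprietary.

from dataclasses import dataclass

LOUDNESS_THRESHOLD = 0.8  # normalised amplitude that counts as a "loud bang"


@dataclass
class Incident:
    peak_amplitude: float
    machine_label: str        # "gunshot" or "other"
    needs_human_review: bool  # every machine decision goes to an analyst


def detect(samples):
    """Return the peak amplitude if it crosses the threshold, else None."""
    peak = max(abs(s) for s in samples)
    return peak if peak >= LOUDNESS_THRESHOLD else None


def classify(peak):
    """Placeholder classifier: only the loudest impulses are flagged as gunshots."""
    return "gunshot" if peak >= 0.9 else "other"


def process(samples):
    peak = detect(samples)
    if peak is None:
        return None  # no loud bang, nothing reaches the review room
    # As in the incident review room, the machine label is never final:
    # a human analyst confirms or overrides it.
    return Incident(peak, classify(peak), needs_human_review=True)
```

A quiet recording never reaches an analyst; any loud bang does, regardless of how the machine labelled it.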
In the incident review room, former teacher Ginger Ammon allows me to sit with her as she analyzes these decisions in real time.
Whenever the algorithm signals a potential hit, it emits a “ping” sound.
Ms. Ammon first listens to the recording and then studies the waveform it produces on her computer screen.
“We’re looking to see how many sensors picked it up and whether the sensors form a directional pattern, because, in theory, a gunshot can only travel in one direction,” she says.
Once she is sure that a shot has been fired, Ms. Ammon clicks a button that sends police officers to the scene.
It all happens in less than 60 seconds.
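The "directional pattern" Ms. Ammon looks for comes from the fact that the same bang reaches different sensors at slightly different times. One standard way to turn those time differences into a location is a grid search over candidate points. The sensor positions, timings, and grid search below are all hypothetical, chosen to illustrate the principle; ShotSpotter's actual localisation method is proprietary.

```python
# Illustrative sketch: locating a loud sound from its arrival times at
# several sensors. All positions and the brute-force grid search are
# hypothetical; ShotSpotter's real method is proprietary.

SPEED_OF_SOUND = 343.0  # metres per second


def locate(sensors, arrival_times, size=200):
    """Grid-search the point whose predicted arrival-time differences
    best match the measured ones (pairwise differences cancel the
    unknown instant the shot was fired)."""

    def residual(x, y):
        dists = [((x - sx) ** 2 + (y - sy) ** 2) ** 0.5 for sx, sy in sensors]
        err = 0.0
        for i in range(len(sensors)):
            for j in range(i + 1, len(sensors)):
                predicted = (dists[i] - dists[j]) / SPEED_OF_SOUND
                measured = arrival_times[i] - arrival_times[j]
                err += (predicted - measured) ** 2
        return err

    # Try every metre of a size x size area and keep the best fit.
    points = [(x, y) for x in range(size) for y in range(size)]
    return min(points, key=lambda p: residual(*p))
```

With three sensors at the corners of a neighborhood, a bang simulated at a known point is recovered by the search, which is why a coherent multi-sensor pattern gives analysts confidence that a real, localised event occurred.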
“It’s like I’m playing a video game,” I say.
“That’s a comment we get a lot,” she replies.
The successes of ShotSpotter
There are clear examples of how ShotSpotter works.
In April 2017, black supremacist Kori Ali Muhammad went on a shooting spree in Fresno, California.
Intent on killing as many white men as possible, he walked through a residential neighborhood, picking off targets as he went.
Police were receiving 911 calls, but they came in late and lacked detail.
Yet ShotSpotter was able to trace Muhammad’s route for the officers.
After three minutes – and three murders – Muhammad was arrested.
Fresno police believe that without ShotSpotter, he would have killed more people.
“ShotSpotter gave us the path he was taking,” says Lt. Bill Dooley.
The company has been hugely successful in persuading police forces to adopt its technology.
Its microphones are in more than 100 cities across America, and for years the technology went largely unquestioned.
That all changed with the murder of George Floyd, as people began to scrutinize the technologies used by so many police forces.
ShotSpotter is too expensive to deploy across an entire city.
Instead, microphones tend to be concentrated in inner-city districts, areas that often have larger Black populations.
So, if the technology isn’t as accurate as claimed, it could have a disproportionate impact on those communities.
Suddenly, ShotSpotter was in the spotlight.
Concerns about accuracy
ShotSpotter claims to be 97% accurate. If true, police could be confident that when a ShotSpotter alert comes in, they are almost certainly responding to a real gunshot.
But that claim is exactly that: a claim. It is hard to see how ShotSpotter can know it is that accurate, at least from the information the company has made public.
And if it isn’t, it could have far-reaching consequences for American justice.
The first problem with the accuracy claim is that it is often difficult to tell, on the ground, whether a gun was actually fired.
When the Chicago Inspector General’s office investigated, it found physical evidence of a gunshot in only 9% of ShotSpotter alerts.
“That’s a low number,” says the city’s Deputy Inspector General for Public Safety, Deborah Witzburg.
This means that in 91% of police responses to ShotSpotter alerts, it is hard to say for certain that a gun was fired. That is not to say there was no gunshot; it is just hard to prove there was.
A gunshot can sound very similar to a firecracker or a car backfiring.
So how can ShotSpotter be so sure it is almost 100% accurate? That’s a question I put to Mr. Clark.
“We rely on ground truth from agencies, which tell us when we miss incidents, when there are no detections, or when we get a classification wrong,” he tells me.
But critics say the methodology has a fundamental flaw. If the police aren’t sure if a gunshot was fired, they won’t tell the company they were wrong.
In other words, critics say, the company has counted “I don’t know”, “maybe” and “probably” as “right”.
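The critics' arithmetic can be made concrete with a toy calculation. The numbers below are hypothetical, not ShotSpotter's data: they simply show that if every alert nobody disputes is counted as correct, the reported figure measures how rarely feedback arrives rather than how often the classifier is right.

```python
# Hypothetical numbers illustrating the critics' argument: counting
# undisputed alerts as "correct" can yield a very different figure
# than counting only independently verified ones.

def reported_accuracy(total_alerts, confirmed_wrong):
    """Accuracy when every alert not reported as wrong counts as correct."""
    return (total_alerts - confirmed_wrong) / total_alerts


def verified_accuracy(total_alerts, confirmed_right):
    """Accuracy when only alerts with physical evidence count as correct."""
    return confirmed_right / total_alerts


alerts = 1000    # hypothetical alert volume
wrong = 30       # alerts police actively reported as misclassified
verified = 90    # alerts where evidence of a gunshot was found (cf. Chicago's 9%)

print(reported_accuracy(alerts, wrong))     # 0.97 - the marketing-style figure
print(verified_accuracy(alerts, verified))  # 0.09 - the evidence-based figure
```

The truth presumably lies between the two extremes: some unverified alerts were real gunshots. But the gap shows why critics want the methodology, not just the headline number, made public.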
Chicago defense attorney Brendan Max says the company’s accuracy claims are “marketing nonsense.”
“Customer feedback is used to decide whether people like Pepsi or Coca Cola best,” he says.
“It is not designed to determine if a scientific method works.”
Conor Healy, who analyzes security systems for the IPVM video surveillance research group, is also deeply skeptical of the 97% accuracy figure.
“Putting the burden on police to report every false positive means expecting them to report on things when nothing has happened … which they are unlikely to do,” says Healy.
“It is fair to assume that if they [ShotSpotter] had solid test data to back up their claims, they would have every incentive to release it.”
Gun crime on the rise
Back in Fresno, I join police officer Nate Palomino on a night-time ride-along.
Fresno has some of the worst gun crime in California, and, like many other American cities, it has seen things get worse over the past two years.
Sure enough, a ShotSpotter alert comes in. But when we reach the scene, no shell casings are found and there is no other physical evidence of a gunshot.
Officer Palomino tells me the audio recording sounds like a gunshot, and it seems more than possible it was one, but that is hard to prove.
He says the scenario is typical.
ShotSpotter’s accuracy needs to be beyond doubt.
It has been used in courts across the country as evidence for both defense and prosecution.
The concern is that, if it is not as accurate as claimed, ShotSpotter is sending officers into situations where they wrongly expect gunfire.
Alyxander Godwin, who campaigned to get rid of ShotSpotter in Chicago, sums up the concern.
“The police expect these situations to be hostile,” he says.
“They expect there to be a gun, and because of where the system is deployed, they expect a Black or brown person to be holding that gun.”
But ShotSpotter says there is no data to support this theory.
“What you’re describing is a situation where officers arrive at a scene and effectively shoot at unarmed people,” says Mr. Clark.
“It’s just not in the data; it’s speculation.”
However, he does seem to accept that the company’s accuracy methodology has limitations.
“It might be fair to say, ‘Hey, look, you’re not getting all the feedback you could get,’” says Clark.
“It might be a fair criticism.”
Mr. Max, the Chicago attorney, says ShotSpotter’s reports shouldn’t be admitted as evidence in court until the company can better support its claims.
“Over the past four or five months, I’ve been aware of dozens of Chicago citizens who have been arrested based on ShotSpotter’s evidence,” he says.
“I’m sure it has occurred in cities across the country.”
He also says the company should open its systems to better review and analysis.
For example, who independently monitors the quality of analysts? And how often does the algorithm disagree with the human analyst?
Certainly, from my time at ShotSpotter’s incident review center, it appears common for analysts to disagree with the computer’s classification.
“It just filters what we see,” says Ms. Ammon.
“But honestly, I don’t even look at it [the classification], I’m so busy looking at the sensor patterns.”
It is an interesting admission. The technology is sometimes portrayed as all-seeing and all-knowing: a computer masterfully detecting gunshots on its own.
In practice, analysts play a much bigger role than I expected.
Lawyers like Brendan Max are keen to dig into how the technology works when it is presented in court.
ShotSpotter has received a lot of criticism over the past year, and not all of it has been fair.
And much of the coverage casually ignores the fact that law enforcement often gives rave reviews about the technology’s effectiveness.
The company, for its part, points to instances where ShotSpotter alerts led police to gunshot victims, for example, saving lives.
In several cities across America, activists are trying to persuade cities to withdraw ShotSpotter’s contracts.
But in other places, ShotSpotter is expanding.
In Fresno, police chief Paco Balderrama is looking to expand coverage, at a cost of $1m (£0.7m) a year.
“What if ShotSpotter only saved one life in a given year? Is it worth a million dollars? I guess it is,” he says.
The ShotSpotter debate is extremely complex and has important potential ramifications for community policing in America.
It is unlikely to go away until the accuracy of the technology is independently verified and the data has been peer-reviewed.
This article is sourced from BBC News.