When a lone gunman began firing into a crowd at a Las Vegas country music festival in October 2017, dozens of attendees captured the scene with cellphone video and posted it online.
Bystander video of mass shootings and other violent acts has become commonplace thanks to smartphones and video platforms, and now Carnegie Mellon University researchers have created a system that uses those videos to accurately locate the source of gunshots, combining the recordings' audio with the positions of the devices that captured them.
It’s not designed for real-time use, but rather as a system of documentation. It’s similar to the ShotSpotter technology used by the City of Pittsburgh, which relies on audio from microphones mounted on buildings.
CMU’s Video Event Reconstruction Analysis, or VERA, system isolates audio from video captured with arbitrary – sometimes roaming – cameras.
Professor Alex Hauptmann used video from the Las Vegas shooting posted on YouTube and Twitter to showcase the system’s capabilities.
It uses triangulation to locate the source of gunshots based on the cameras’ positions. In the Las Vegas test, VERA correctly estimated the shooter’s location in the north wing of the Mandalay Bay hotel.
“With some manual verification [we can] figure out where a handful of cameras were at the time they were recording and then after synchronizing the sounds just run it through the system to calculate their differences in time,” Hauptmann said.
The system looks at the time delays between the sounds a gunshot produces, as heard by each camera.
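VERA's exact algorithm isn't described in detail here, but the principle Hauptmann outlines, synchronizing audio from several cameras at known positions and using the differences in arrival time to triangulate, is standard time-difference-of-arrival (TDOA) localization. The sketch below is a minimal, hypothetical illustration of that idea, not CMU's implementation: it simulates three cameras hearing one gunshot and recovers the source with a brute-force grid search over candidate positions.

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 C

def tdoa_locate(cams, delays, grid_step=5.0, extent=400.0):
    """Brute-force 2-D search for the point whose pairwise
    time-differences-of-arrival best match the measured delays.

    cams:   list of (x, y) camera positions in metres
    delays: dict mapping camera index pairs (i, j) to the measured
            arrival-time difference t_i - t_j in seconds
    """
    best, best_err = None, float("inf")
    steps = int(extent / grid_step)
    for gx in range(-steps, steps + 1):
        for gy in range(-steps, steps + 1):
            x, y = gx * grid_step, gy * grid_step
            # Distance from this candidate point to each camera
            dists = [math.hypot(x - cx, y - cy) for cx, cy in cams]
            # Sum of squared residuals between predicted and measured delays
            err = sum(
                ((dists[i] - dists[j]) / SPEED_OF_SOUND - dt) ** 2
                for (i, j), dt in delays.items()
            )
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Simulated scenario: three cameras and one gunshot source.
cams = [(0.0, 0.0), (200.0, 0.0), (0.0, 300.0)]
source = (120.0, 250.0)

# Arrival time at each camera (what synchronized audio would reveal)
arrivals = [math.hypot(source[0] - cx, source[1] - cy) / SPEED_OF_SOUND
            for cx, cy in cams]
delays = {(i, j): arrivals[i] - arrivals[j]
          for i, j in itertools.combinations(range(len(cams)), 2)}

est = tdoa_locate(cams, delays)
```

In practice the hard part is what Hauptmann describes: establishing where each roaming camera was and aligning the audio tracks precisely enough that these sub-second delays are meaningful.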
"When we began, we didn't think you could detect the crack with a smartphone because it's really short," Hauptmann said. "But it turns out today's cell phone microphones are pretty good."
He said VERA won’t replace systems like ShotSpotter, but could supplement the work of human rights advocates or journalists investigating or reconstructing terrorist acts, war crimes or human rights violations.
“There is a sense that it can be used both by the authorities but also by people who are looking for the truth in whatever official report comes out,” he said.
It could also help to automate what can be traumatizing work for people.
“The hope is that as we improve our tools and make them more suitable for various types of analysis that they want to do, we can reduce the amount of gory video you have to watch,” he said.
Hauptmann’s fellow researcher Jay D. Aronson, a professor of history at CMU and director of the school’s Center for Human Rights Science, said military and intelligence agencies are already developing these types of technologies.
"We think it's crucial for the human rights community to have the same types of tools. It provides a necessary check on state power,” Aronson said.
The researchers recently released VERA as open-source code at a conference in France. Hauptmann said he made the software free to get it to people who need it as quickly as possible.
WESA receives funding from Carnegie Mellon University.