By Link Lab

Homa Alemzadeh and her research team have come a long way since winning a $1.1 million grant in 2018 to build an artificial intelligence assistant to help first responders make good decisions during emergencies. As first reported in 2018, the project's goal is to develop a wearable, voice-activated assistant that collects sound data from the incident scene and, in response, provides dynamic feedback to help the responder by suggesting appropriate medical interventions.

“We think we will make it much easier for responders to focus on the information that matters — and in the process save lives,” Alemzadeh said.
Alemzadeh, an assistant professor in the University of Virginia's Department of Electrical and Computer Engineering and a member of UVA Engineering's Link Lab for cyber-physical systems, is collaborating with Ronald Williams, associate professor of electrical and computer engineering, and John Stankovic, BP America Professor of Computer Science and director of the Link Lab.
The team demonstrated their progress on the project, called CognitiveEMS, at the 2020 National Institute of Standards and Technology Annual Public Safety Broadband Stakeholder Meeting in July. Funding for the research comes from the institute, known as NIST, an agency of the U.S. Department of Commerce.
The demonstration showed how CognitiveEMS collects speech at the scene and responds with suggestions for the most appropriate medical interventions, while also automatically generating an incident report.
Throughout the project, the UVA Engineering researchers have been collaborating with Richmond Ambulance Authority and the North Garden Volunteer Fire Company to test and improve the system.
A recording of the team's July presentation, with details of the project to date, is available here.