Published: 
By Audra Book

UPDATE: In April, the team was awarded a new $1.1 million grant from the National Institute of Standards and Technology's Innovation Accelerator Program to develop an expanded application of CognitiveEMS that gathers multimodal sensor data from incident scenes and interacts with a team of responders through augmented reality devices.

George Stephens has been chief of the North Garden Volunteer Fire Company for 20 years, and a firefighter even longer. He joined the company as a young man after he watched firemen with inadequate tools and unreliable water resources struggle to salvage a house.

“Despite their heroic efforts, the house burned to the ground. It was the home of the custodian at my elementary school,” he said. “I realized I had just watched on the sidelines. That day I asked myself what I was doing to help and got involved.”

A lot has changed since he first volunteered.

“Initially we only responded to fires, but over time we were responding to more car accidents,” he said. “We decided to get medical certifications because we wanted to do more for our residents than just hold their hand and comfort them until the ambulance arrived.”

Yearly calls have more than doubled since the early 1990s, from 300 to over 800 in a typical year. Today, close to two-thirds of distress calls are for medical emergencies or car crashes. So, when a team of University of Virginia School of Engineering faculty approached Stephens to see if the company would collaborate on development of a new technology that could revolutionize first responders' work, he didn't hesitate.

Homa Alemzadeh, assistant professor in the Charles L. Brown Department of Electrical and Computer Engineering, is lead researcher on the new technology. Her collaborators are John A. Stankovic, BP America Professor of Computer Science and director of UVA Engineering's Link Lab for cyber-physical systems, and Ronald D. Williams, associate professor of electrical and computer engineering.