Reimagining Alexa and Siri’s Sensors for Helping with Patient Communication


Laura Barnes, professor of systems engineering and associate director of the UVA Link Lab (Photo by Tom Cogill, UVA Engineering)


We all agree that important conversations deserve the utmost consideration in choosing words, tone and content. But these elements become even more significant when the conversation involves a health care practitioner telling a patient about a terminal diagnosis or a serious medical condition.

According to the Institute for Healthcare Communication, “research evidence indicates that there are strong positive relationships between a health care team member’s communication skills and a patient’s capacity to follow through with medical recommendations, self-manage a chronic medical condition and adopt preventive health behaviors.”

During a patient visit, doctors and nurses are deeply focused on understanding and treating the complex and sometimes critical physiological aspects of a patient’s condition. Wouldn’t it be useful to have an assistant to help them toggle over to the behavioral aspect of care – their bedside manner?

Laura Barnes, a professor of systems engineering at the University of Virginia’s School of Engineering and Applied Science, teamed up with associate professor of nursing Virginia LeBaron to combine mobile sensing technology, natural language processing algorithms and machine learning to create a communication assessment tool that monitors patient-clinician exchanges.

It all starts with a smartwatch.

“Android and Apple watches have embedded sensors,” Barnes said. “Typically, we use them for navigation or fitness monitoring. Our work reimagines these sensors for use in improving patient care and better understanding patient health. In this case, we’re trying to use them to understand an encounter between a physician and a patient.”

For phase one of the project, the team scoured natural language processing research and created a list of factors proven to constitute good and bad communication. They created metrics that would measure things like the number of words used to indicate exploring, seeking, understanding and empathy. Another metric would measure pauses and open-ended questions — indicators of the patient’s need for empathy — and whether there was a response. Another would tally the amount of medical jargon used.
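Metrics like these can be computed from a transcript with simple word and sentence counts. The sketch below is only illustrative: the word lists, cue phrases and scoring rules are invented for the example, not the team's actual metric definitions.

```python
import re

# Hypothetical cue lists -- stand-ins for the research-derived factors.
EMPATHY_CUES = {"understand", "feel", "sorry", "hear", "imagine"}
MEDICAL_JARGON = {"metastasis", "prognosis", "idiopathic", "etiology"}
OPEN_ENDED_STARTS = ("how", "what", "tell me", "why")

def score_utterance(text: str) -> dict:
    """Score one clinician utterance against toy communication metrics."""
    words = re.findall(r"[a-z']+", text.lower())
    sentences = [s.strip() for s in re.split(r"[.?!]", text) if s.strip()]
    return {
        "empathy_words": sum(w in EMPATHY_CUES for w in words),
        "jargon_words": sum(w in MEDICAL_JARGON for w in words),
        "open_questions": sum(
            s.lower().startswith(OPEN_ENDED_STARTS) for s in sentences
        ),
    }

scores = score_utterance(
    "I understand how hard this is. What worries you most about the prognosis?"
)
# scores -> {"empathy_words": 1, "jargon_words": 1, "open_questions": 1}
```

A real system would draw its word lists and question patterns from the natural language processing literature the team surveyed, but the counting structure is the same.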

Barnes’ team at the Sensing System for Health (S2He) Lab had developed an app called SWear that allowed them to tailor the types of data collected through the smartwatch sensors for this phase. The data is then transmitted to the cloud, where the team can access it from the lab for further analysis.

First, they aggregate the data into trends and extract features according to the communication metrics they developed. Then they apply machine learning built on natural language processing algorithms to analyze those features and form a final picture of the interaction.

“We found that our data-capturing and analysis process works quite well for the majority of communication metrics we have identified as important,” Barnes said.

“And since no one person is the same, and no one conversation is the same, each interaction is different,” she continued. “Usually when people think of automation, they think of assembly line, cookie-cutter and ‘one-size-fits-all.’ Ironically, this automated detection and analysis method has the potential to enable more personalized care.”


The prototype wristwatch here, called CommSense, offers meaningful, real-time ways to improve how doctors and nurses talk with their patients. (Photo by Sanjay Suchak, University Communications)


The team plans to submit a grant proposal to further develop and evaluate the new technology in clinical settings and refine the system based on clinician and patient feedback.

“Our collaborators in the medical domains are super inclusive and supportive,” said Zhiyuan Wang, a second-year systems engineering Ph.D. student in Barnes’ lab who is working on the project.

“I found it invigorating to work collaboratively with them to bridge our respective domains and develop new technology. As they had their established knowledge systems and research styles, and we had ours, we got super engaged in intensive collaboration to ensure alignment. While they are dedicated to understanding the patient’s well-being, we focus more on creating and solving engineering problems for them. By joining forces, we can push the boundaries!”

The health care collaborators were surprised that the engineers could quantify empathy and emotions, Wang and his fellow researcher S.M. Nusayer Hassan discovered.

“When we worked with medical staff to test our script scenarios and validate our metrics and processes, we found that just being exposed to our project made them think about how they are conducting their conversations with patients. The implemented solution should be even better!” Hassan, a systems engineering master’s student, said.

The team will develop a feedback mechanism in the next phase of the project.

“Feedback could eventually be real-time,” Barnes said. “For example, during an interaction, you could provide a vibration – but not a sound so only the clinician knows – to let him or her know to slow down or stop using medical jargon.

“In some fields, like aviation, sensors are used to monitor many different systems on the plane and then give the pilots feedback about what’s going on. We’re doing something similar except from the human interaction standpoint.”
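The silent, in-the-moment cue Barnes describes could be driven by a simple running threshold. The sketch below is a toy version of that idea; the jargon list, window size and threshold are all invented for illustration.

```python
from collections import deque

class JargonMonitor:
    """Hypothetical real-time check: signal a silent vibration when recent
    utterances contain too much medical jargon (thresholds are made up)."""

    JARGON = {"metastasis", "prognosis", "etiology"}

    def __init__(self, window=5, threshold=3):
        self.recent = deque(maxlen=window)   # jargon counts per utterance
        self.threshold = threshold

    def update(self, utterance: str) -> bool:
        count = sum(w.strip(".,?").lower() in self.JARGON
                    for w in utterance.split())
        self.recent.append(count)
        # True means: fire a vibration cue only the clinician notices.
        return sum(self.recent) >= self.threshold

monitor = JargonMonitor()
alerts = [monitor.update(u) for u in [
    "Your prognosis depends on the etiology.",
    "The metastasis has not spread further.",
]]
# alerts -> [False, True]: the second utterance pushes jargon past the threshold
```

On an actual watch, a `True` return would trigger the device's haptic motor rather than a sound, keeping the cue private to the clinician.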

Barnes envisions this tool, called CommSense, being used in extremely sensitive exchanges like end-of-life palliative care. She also hopes the device could be used as a training tool for new physicians and nurses.

“We also want to gather data about body language, heart rate and other physiological metrics to keep building a bigger communication picture and give health care providers more ‘flight controls,’” she said.

Barnes was recently named associate director of the Link Lab, UVA’s highly collaborative, cross-disciplinary center for cyber-physical systems research, which includes strong programs in smart health. A founding member of the research center, she focuses her research on designing mobile and wearable wireless technologies that help to better understand human behavior – and how that information can help improve health. Other projects include managing medication adherence and monitoring and treating anxiety.

As a Link Lab co-leader, Barnes will spearhead University-wide efforts in smart and connected health research. She also will work with Link Lab’s new director, professor of civil and environmental engineering Jon Goodall, on strategic planning for the center.


Barnes' work was made possible by a seed grant from the UVA Center for Engineering in Medicine.