Lu Feng Earns Prestigious National Science Foundation Grant to Create Formal Methods that Increase the Safety and Reliability of Human-Cyber-Physical Systems
According to the International Federation of Robotics, 2.4 million industrial robots were operating in factories worldwide as of 2020. Human interactions with cyber-physical systems, like robots, drones and self-driving vehicles, will continue to increase dramatically.
Although science fiction often paints a future in which robots replace humans, a far likelier scenario involves humans and robots working side-by-side. This trend toward what researchers call human-cyber-physical systems is accelerating in all economic sectors, including health care, manufacturing and energy.
The hurdle to that future is humans’ lack of trust in cyber-physical systems. Humans must be assured of safety and reliability before cyber-physical systems will be adopted broadly.
Lu Feng, an assistant professor with joint appointments in the Department of Computer Science and Department of Engineering Systems and Environment at the University of Virginia School of Engineering, recently earned a CAREER Award from the National Science Foundation to design rigorous, model-based evaluations of human and cyber-physical system interactions, with the goal of improving systems’ safety and reliability.
Through the CAREER Award program, the National Science Foundation recognizes early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their departments or organizations.
Feng earned her Ph.D. in engineering at the University of Oxford, where she studied formal methods. She then went to the University of Pennsylvania to conduct post-doctoral work in the area of cyber-physical systems. The one certainty in her research was that human operators’ behavior, when interacting with cyber-physical systems such as robots and medical devices, was uncertain.
It was during this time that Feng started to think about how formal methods could be applied to human behavior, and to imagine how valuable this could be in building safety and trust into a human-cyber-physical system.
“Human operators introduce intentions, beliefs and actions that mesh with the system autonomy,” she said. “I believed that formal modeling methods could take into account the uncertainty due to human interactions and build in mathematical precision to account for human behaviors with certainty.”
Once Feng completed her post-doctoral research in 2016, she joined the UVA Engineering faculty to be a part of the school’s new Link Lab, where she would be immersed in an incubator space with other faculty and graduate students devoted to interdisciplinary research in cyber-physical systems. Feng immediately set out to investigate her theory using an autonomous vehicle test case.
It took about a year for Feng to set up a simulator vehicle for observing human reactions to autonomy, using sensors that gather biometric data as indicators of trust or lack of trust. She used the simulator to conduct her initial human subject study, investigating how humans’ trust in autonomous driving evolves and how different factors influence it.
That study of just 19 human subjects led to a published dataset of driver behaviors. It also established a baseline from which models of human behavior could be applied to much larger populations, without the need for empirical, human driver test cases.
This preliminary research led to Feng’s CAREER Award, because the National Science Foundation recognizes the need to build more advanced modeling tools around human-autonomous systems. This is an area of cyber-physical systems study that is in its infancy, and one which Feng is helping to define.
With the CAREER Award, Feng will further refine human behavior models using the test case data from her initial study. The research aims to create more powerful models that can predict human and autonomous vehicle interactions that normally could only be obtained from millions of miles of on-the-road, human driving studies.
“Lu is pioneering methods for understanding human behaviors that do not exist today,” said Kevin Skadron, who chairs UVA Engineering’s Department of Computer Science. “These formal models will accelerate insight regarding human trust factors, leading to safer design prototypes in a much shorter timeframe for a wide range of applications, from autonomous cars to robots to medical devices.”
Feng chose the autonomous vehicle case study for her research because of the huge hurdles to adoption rooted in people’s perceptions of safety. Public trust in autonomous vehicles dropped from 71% to 68% following several high-profile autonomous vehicle fatalities in 2018.
Public perception is not improving.
A 2020 survey by the Advocates for Highway & Auto Safety found that 85% of respondents were concerned about sharing the road with a driverless car, while 68% said that knowing companies had to meet minimum safety requirements for their driverless cars before selling them to the public would ease their concerns.
“Establishing safety is the most pressing bottleneck in convincing people to buy self-driving cars and support regulations that allow these vehicles on our highways,” Feng said.
She points out that her models are not intended to replace the human-driver testing conducted by companies like Toyota, which is collaborating with her to incorporate designs for trustworthy autonomy into future vehicles. Feng meets with her Toyota collaborators monthly to ensure that the research assumptions and criteria are industrially relevant.
“This research complements the empirical testing that automotive companies such as Toyota are conducting in order to create a more complete picture,” she said. “It is not possible for empirical, human driver studies to account for every potential driving scenario. With reliable models that can simulate human-autonomous interactions with certainty, you can run an unlimited number of virtual driving scenarios to supplement the empirical data.”
"UVA has a long history of collaborating with industry partners to tackle grand challenges and create solutions that broadly benefit society,” said Brian Smith, professor and chair of the Department of Engineering Systems and Environment. “I am grateful to Lu for leading this important industry partnership that will be a model for the nation in establishing safer autonomous systems."
Feng’s research will be a valuable resource during the future design and development of autonomous systems. The additional lens through which designers can evaluate a human operator’s reactions and perceptions will lead to autonomous vehicles that interact more effectively with their human operators.
For example, manufacturers could use this data to determine at what point an autonomous vehicle should invite the human operator to interact or take full control. The system could even converse with the human during ongoing interactions to build trust.
The goal is to mimic the human-to-human interactions in trusted relationships. Better understanding in this area will also provide advances in the design of a wide variety of human-cyber-physical systems, like collaborative robots and medical devices. More importantly, these advances will create a bridge to understanding how autonomous systems, without human operators, can act like a human and behave in a trustworthy manner.
Feng will incorporate the research in her “CPS: Formal Methods, Safety and Security” course, part of the cyber-physical systems graduate track being developed by UVA Engineering through funding from the National Science Foundation. She seeks to inspire a diverse group of future researchers through her focus on societal benefit, as well. “Studies have shown that women and minorities in particular become more engaged with problems that have social value,” she said.