Contact
Location
Olsson Hall 101D
Lab
Olsson Hall 002
151 Engineer's Way
Charlottesville, VA 22904
Google Scholar · ResearchGate · Gerling Touch Lab · YouTube

About

I am a Professor of Systems Engineering at the University of Virginia, with courtesy appointments in Mechanical and Biomedical Engineering. My group's research spans haptics, computational neuroscience, human factors and ergonomics, biomechanics, and human–machine interaction. We employ computational models, statistical analysis, data science, and imaging techniques; design and build unique mechanical and electrical devices; and perform psychophysical experiments with human participants. Our research is highly collaborative and interdisciplinary and centers on human health.

At present, my work investigates the cutaneous and proprioceptive cues that convey an object's softness, as well as the neural basis of social and emotional touch. This work is done in close collaboration with neuroscientists, in both human and mouse model systems. A thorough understanding of tactile cues at the early, peripheral stages is key to deciphering the whole perceptual chain, as well as to engineering sensors and human–machine interfaces.

I have been the principal investigator on several federally funded grants from the National Institutes of Health, the National Science Foundation, DARPA, and other agencies and companies. I am a senior member of the IEEE and have mentored more than 20 PhD and MS students and published over 50 journal and conference papers. I served as co-chair of the IEEE Haptics Symposium in 2018 and 2020 and as co-editor of the IEEE World Haptics Conference in 2017 and 2019. I currently serve as chair of the IEEE Technical Committee on Haptics and as an associate editor of the IEEE Transactions on Haptics.

Before entering academia, I gained industry experience at Motorola, NASA Ames Research Center, and Rockwell Collins. I currently consult on UX/UI design and evaluation in healthcare. I have taught courses in human factors, human–machine interaction, and user experience design for several years. In my undergraduate teaching, one of my goals is to work with industry clients so that students gain real-world experience.

Education

B.S. Computer Science, University of Iowa, 1998

M.S. Industrial Engineering, University of Iowa, 2001

Ph.D. Industrial Engineering, University of Iowa, 2005

Gregory J. Gerling, Professor

My research interests surround the interface of people and machines, focusing in particular on the sense of touch. My group mixes computational modeling with the design and prototyping of devices with which people directly interact.

Research Interests

Haptics
Human Factors / Human Computer Interaction
Computational Neuroscience
Biomechanics and Mechanobiology
Quantitative Biosciences

Selected Publications

An elasticity-curvature illusion decouples cutaneous and proprioceptive cues in active exploration of soft objects, PLOS Computational Biology, 17(3), 2021. Xu, C., Wang, Y., and Gerling, G.J.
Subtle Contact Nuances in the Delivery of Human-to-Human Touch Distinguish Emotional Sentiment, IEEE Transactions on Haptics, 2022. Xu, S., Xu, C., McIntyre, S., Olausson, H., and Gerling, G.J.
Using Digital Image Correlation to Quantify Skin Deformation with Von Frey Monofilaments, IEEE Transactions on Haptics, 2022. Kao, A.R., Xu, C., and Gerling, G.J.
The Language of Social Touch Is Intuitive and Quantifiable, Psychological Science, 2022. McIntyre, S., Hauser, S.C., Kusztor, A., Boehme, R., Moungou, A., Isager, P.M., Homman, L., Novembre, G., Nagi, S.S., Israr, A., Lumpkin, E.A., Abnousi, F., Gerling, G.J., and Olausson, H.
Computation predicts rapidly adapting mechanotransduction currents cannot account for tactile encoding in Merkel cell-neurite complexes, PLOS Computational Biology, 14(6), 2018. Gerling, G.J.*, Wan, L., Hoffman, B.U., Wang, Y., and Lumpkin, E.A.
Computation identifies structural features that govern neuronal firing properties in slowly adapting touch receptors, eLife, 3:e01448, 2014. Lesniak, D.R., Marshall, K.L., Wellnitz, S.A., Jenkins, B.A., Baba, Y., Rasband, M.N., Gerling, G.J., and Lumpkin, E.A.
Imaging the 3-D deformation of the finger pad when interacting with compliant materials, IEEE Haptics Symposium, 2018. Hauser, S.C. and Gerling, G.J.
User Experience Design for Human-Machine Teaming in Commanding a Distributed Constellation of Unmanned Assets in Search and Rescue, Annual Meeting of the Human Factors and Ergonomics Society, 2021. Anderson, T., Fogarty, K., Kenkel, H., Raisigel, J., Zhou, S., Reggia, L.M., Manizade, S.G., Lesniak, D.R., and Gerling, G.J.

Courses Taught

SYS3023: Human-Machine Interface
SYS6007: Human Factors I
SYS4024/6024: User Experience Design
SYS6026: Quantitative Models of Human Perceptual Information Processing
SYS6064: Applied Human Factors Engineering (Accelerated Master's Program)

Featured Grants & Projects

Tying the finger pad's deformation to perception of compliance, or 'softness'
Touch-interactive displays, such as phones and tablets, have become a normal part of life over the past ten years. To replicate touching another's hand or tissues in medical exams, the next generation of displays needs to relay a sense of compliance, or 'softness.' While a number of groups are working to mimic compliant interactions at the finger pad, employing tilting plates, concentric rings, particle jamming, and fabric stretch, these displays do not yet feel natural. To address a gap in our understanding of the biomechanical cues at the finger pad, we have developed a method for 3-D visual observation of the skin's surface as it is deformed by transparent, compliant stimuli. We are studying how the finger pad must deform in space and time to adequately encode the percept of compliance. In particular, time-dependent cues, or information in the rate of change of skin deformation over a spatial field, may be vital to information transfer and mechanical efficiency.
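
As a toy illustration of one such time-dependent spatial cue, the sketch below estimates gross contact area and its rate of change from a sequence of binary contact masks, as might be segmented from surface images. The masks, pixel scale, and code are hypothetical placeholders for illustration, not our actual imaging pipeline.

    import numpy as np

    # Estimate gross contact area (mm^2) from a binary contact mask
    # (True = skin judged to be in contact), given a pixel scale.
    def contact_area_mm2(mask, mm_per_px=0.05):
        return mask.sum() * mm_per_px**2

    # Rate of change of contact area across a sequence of masks sampled
    # every dt_s seconds: one simple time-dependent, spatially integrated cue.
    def area_and_rate(masks, dt_s):
        areas = np.array([contact_area_mm2(m) for m in masks])
        return areas, np.gradient(areas, dt_s)

    # Synthetic sequence: a contact region whose radius grows frame by frame.
    yy, xx = np.mgrid[-100:100, -100:100]
    masks = [(xx**2 + yy**2) <= (10 + 5 * k)**2 for k in range(10)]
    areas, rates = area_and_rate(masks, dt_s=0.1)
    print(areas.round(2))   # mm^2 per frame
    print(rates.round(2))   # mm^2/s per frame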

Defining the principles that govern the neural output of touch receptors
Despite past progress, the principles that govern the neural output of touch receptors have not been fully defined. We take an approach that combines computational models with experimental methods. Our work in collaboration with Dr. Ellen A. Lumpkin of Columbia University has sought to define rules in the end organ architecture of the Merkel cells and neurites innervated by slowly adapting type I afferents. We have built end organ models to show that the grouping of Merkel cells onto heminodes strongly influences the sensitivity function of an afferent. Such computational single-unit models are built for individual neural afferents and comprise finite elements to capture skin mechanics, differential equations to represent sensory transduction, and integrate-and-fire models to mimic neural dynamics.
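
To make the last of those three stages concrete, here is a minimal leaky integrate-and-fire sketch. The parameters and the step-current input are generic placeholders, not values from our published afferent models; the current trace simply stands in for the output of the upstream skin-mechanics and transduction stages.

    import numpy as np

    # Generic leaky integrate-and-fire neuron. The input is a transduction
    # current trace (nA) sampled every dt_ms; the output is a list of spike
    # times. All parameter values are placeholders, not fitted model values.
    def lif_spike_times(current_nA, dt_ms=0.1, tau_ms=10.0, r_mohm=10.0,
                        v_rest=-70.0, v_thresh=-55.0, v_reset=-70.0):
        v = v_rest
        spikes = []
        for i, i_in in enumerate(current_nA):
            # Membrane potential leaks toward rest and is driven by the input.
            dv = (-(v - v_rest) + r_mohm * i_in) / tau_ms
            v += dv * dt_ms
            if v >= v_thresh:
                spikes.append(i * dt_ms)
                v = v_reset
        return spikes

    # Example: a sustained 2 nA step (500 ms at 0.1 ms resolution).
    step = np.full(5000, 2.0)
    print(lif_spike_times(step)[:5])   # first few spike times, in ms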

Compression measurements of skin during development and aging
Because the end organs of touch receptors are so intertwined with the skin substrate in which they are embedded, we have performed systematic measurements of the skin stiffness and thickness that are vital to constraining our models. These measurements indicate a systematic thickening of the skin across phases of the mouse hair cycle, as well as other changes in the skin's time-dependent relaxation. These skin changes, in conjunction with changes in the end organ, could impact homeostatic mechanisms.
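
As one way such relaxation measurements can be summarized, the sketch below fits a standard linear solid (single-exponential) relaxation curve to a force-versus-time trace. The data here are synthetic and the model choice is illustrative, not our actual protocol or fitted values.

    import numpy as np
    from scipy.optimize import curve_fit

    # Standard linear solid relaxation: force decays from (f_inf + f_delta)
    # toward the equilibrium value f_inf with a single time constant tau.
    def sls_relaxation(t, f_inf, f_delta, tau):
        return f_inf + f_delta * np.exp(-t / tau)

    # Synthetic hold-phase force trace (N vs. s), standing in for measured data.
    t = np.linspace(0.0, 30.0, 300)
    rng = np.random.default_rng(0)
    force = sls_relaxation(t, 0.05, 0.10, 4.0) + rng.normal(0.0, 0.002, t.size)

    # Fit and report equilibrium force, relaxed amplitude, and time constant.
    (f_inf, f_delta, tau), _ = curve_fit(sls_relaxation, t, force,
                                         p0=(0.04, 0.08, 2.0))
    print(f"F_inf = {f_inf:.3f} N, dF = {f_delta:.3f} N, tau = {tau:.1f} s")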

Computational modeling indicates that surface pressure can be reliably conveyed to tactile receptors even amidst changes in skin mechanics
Distinct patterns of neuronal firing are observed between classes of cutaneous afferents. Such differences may be attributed to end organ morphology, distinct ion-channel complements, and skin microstructure, among other factors. Moreover, even within the single class of slowly adapting type I afferents, neuronal sensitivity varies considerably between units. Do changes in the skin's mechanics contribute to this variance? Our studies employing finite element analysis suggest that the skin can reliably convey indentation magnitude, rate, and spatial geometry to the locations of tactile receptors amidst natural changes in skin mechanics. Indeed, there can be large variance in neural firing responses amidst skin changes; however, that variance is significantly reduced if mechanosensitive afferents are tuned to respond to a stimulus's surface pressure and to the stress interior to the skin near the sensory end organs.
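
For intuition about how a surface load reaches receptor depths, the classical Boussinesq point-load solution for a linear-elastic half-space offers a back-of-the-envelope analogue to the finite element analyses; the load and depth below are placeholder values, and the formula is a textbook idealization, not our skin models.

    import numpy as np

    # Boussinesq solution: vertical stress at depth z (m) and radial offset
    # r (m) beneath a point load P (N) applied normal to the surface of a
    # linear-elastic half-space: sigma_z = 3 P z^3 / (2 pi (r^2 + z^2)^(5/2)).
    def vertical_stress_pa(p_newtons, r_m, z_m):
        big_r = np.sqrt(r_m**2 + z_m**2)
        return 3.0 * p_newtons * z_m**3 / (2.0 * np.pi * big_r**5)

    # Placeholder example: a 0.1 N load, evaluated at a receptor-like depth
    # of 0.5 mm, directly beneath the load and 1 mm to the side.
    for r in (0.0, 1e-3):
        sigma = vertical_stress_pa(0.1, r, 0.5e-3)
        print(f"r = {r * 1e3:.1f} mm -> sigma_z = {sigma:.0f} Pa")

In this idealized case the vertical stress is independent of the elastic modulus, which echoes, in a highly simplified way, the robustness to changes in skin mechanics described above; the finite element models account for the layering, geometry, and nonlinearity of skin that this formula ignores.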

Medical training simulator design and testing
The use of medical simulators has become increasingly widespread as the use of animals and patients for training novices has come under scrutiny and as medical errors, estimated by the Institute of Medicine to cause 44,000 to 98,000 unnecessary deaths per year, have become better understood. We have worked with clinicians and with medical and nursing educators to create human-machine interfaces that train health care practitioners. Specifically, we designed, built, and evaluated physical-computerized simulators to train clinical palpation skills in breast and prostate screening. The goal is to ensure that clinicians' skills are trained systematically, time-efficiently, and to a high degree of accuracy. We built the simulators using measurements of prostate tissue stiffness obtained in the clinic, as well as psychophysical evaluations of the tactile detection and discrimination thresholds of those who perform these exams. This work was done in collaboration with Dr. Marcus Martin and Prof. Reba Moyer Childress.

User experience design
I teach courses on human-machine interaction and user experience design. My capstone teams have worked on UX problems for clients in many industries. In general, we design graphical, virtual, and physical user interfaces that display information and afford work processes customized to the mental models, cognitive and perceptual capabilities, and tasks and work domains of end users. Important topics include function allocation in automated systems and interaction with large sets of data.

A mobile device to quantify the forces and angles required to overcome knee arthrofibrosis in the clinic
Knee arthrofibrosis, a condition in which scar tissue limits motion of the knee, is encountered after 4% to 35% of ligament surgeries. We designed, built, and evaluated a means of quantifying the forces and angles required to overcome arthrofibrosis during knee manipulation. To capture the relevant cues, the device was built to be mobile and to attach to a common knee brace. Fixtures were designed to afford normal physician postures and manual control of torque. The device utilizes load cells, rotary potentiometers, 3D-printed parts, and integrated circuits. Graphical user interfaces on a mobile phone allow in-the-loop evaluation, and a database is used for subsequent offline analysis. The device is being tested in the clinic with our collaborator Dr. Seth Yarboro.
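
As a minimal sketch of the kind of conversion such a device performs, the snippet below maps raw load cell and potentiometer voltages to torque and flexion angle. The calibration constants and lever-arm geometry are hypothetical, not the device's actual values.

    # Convert raw sensor voltages to clinically meaningful quantities.
    # All calibration constants below are hypothetical placeholders.
    LOADCELL_N_PER_VOLT = 50.0   # load cell sensitivity (N/V)
    LEVER_ARM_M = 0.30           # distance from knee axis to load cell (m)
    POT_DEG_PER_VOLT = 60.0      # rotary potentiometer sensitivity (deg/V)
    POT_OFFSET_DEG = -90.0       # potentiometer reading at full extension

    def torque_nm(loadcell_volts):
        """Applied knee torque (N*m) from the load cell voltage."""
        return loadcell_volts * LOADCELL_N_PER_VOLT * LEVER_ARM_M

    def flexion_deg(pot_volts):
        """Knee flexion angle (degrees) from the potentiometer voltage."""
        return pot_volts * POT_DEG_PER_VOLT + POT_OFFSET_DEG

    # Example reading: 1.2 V on the load cell, 2.0 V on the potentiometer.
    print(f"{torque_nm(1.2):.1f} N*m at {flexion_deg(2.0):.0f} deg of flexion")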