Bio

B.S. Computer Science, University of Iowa, 1998
M.S. Industrial Engineering, University of Iowa, 2001
Ph.D. Industrial Engineering, University of Iowa, 2005

"My research interests surround the interface of people and machines - focusing in particular on the sense of touch. My group mixes computational modeling efforts with the design and prototyping of devices with which people directly interact."

GREG GERLING, ASSOCIATE PROFESSOR

I am an Associate Professor in Systems and Information Engineering, and Biomedical Engineering, at the University of Virginia. My group's research interests span haptics, computational neuroscience, human factors and ergonomics, biomechanics, and human–machine interaction, with a primary focus on human health. Our research is highly collaborative and interdisciplinary. My lab builds and analyzes computational models using solid mechanics, differential equations, and statistical techniques; designs and prototypes devices using electronics, software, and silicone elastomers; and conducts psychophysical experiments. I have a substantial background in skin and receptor physiology related to touch sensation, specifically mechanosensitive peripheral afferents.

Research overview: Where it is difficult or impossible to directly observe skin mechanics, physiology, or neural responses, computational models have helped elucidate the tactile encoding strategies of single peripheral neurons and populations. We have modeled the end organ architecture of the slowly adapting type I afferent, which innervates clusters of Merkel cells, to show how their subgrouping into spike initiation zones can impact the elicited neural response. We have also systematically measured the hyper- and viscoelastic mechanics of the skin as it changes with weight and hair cycle in the mouse. Taken together, the models indicate that the skin and the end organ receptor structure are intertwined in producing trains of action potentials. At the behavioral level, a thorough understanding of tactile cues at the early, peripheral stages is key to deciphering the whole perceptual chain, as well as to engineering sensors and human-machine interfaces.

I have been the principal investigator on over $4M in university- and federally funded grants from the National Institutes of Health, DARPA, and other agencies and companies. I am a senior member of the IEEE, have mentored more than 20 PhD and MS students, and have published over 50 journal and conference papers. I am currently serving as co-chair of the IEEE Haptics Symposium for 2018 and 2020. Before entering academia, I gained industry experience at Motorola, NASA Ames Research Center, and Rockwell Collins, and I have consulted with healthcare companies. I have taught human-machine interaction and user experience design for several years. In my undergraduate teaching, one of my goals is to work with industry clients to orient students toward acquiring real-world experience.

Research Interests

  • Haptics
  • Human Factors / Human Computer Interaction
  • Computational Neuroscience
  • Computational Systems Biology
  • Biomechanics and Mechanobiology
  • Quantitative Biosciences

Selected Publications

  • Force-rate cues reduce object deformation necessary to discriminate compliances harder than the skin, IEEE Transactions on Haptics, in press. Hauser, S. and Gerling, G.J.
  • Computational modeling indicates that surface pressure can be reliably conveyed to tactile receptors even amidst changes in skin mechanics, Journal of Neurophysiology, 116(1):218-228, July 2016. Wang, Y., Baba, Y., Lumpkin, E.A., and Gerling, G.J.
  • Facilitating the collection and dissemination of patient care information for emergency medical personnel, Proceedings of the 2016 IEEE Systems and Information Engineering Design Symposium, University of Virginia, Charlottesville, VA, USA, pp. 239-244. Fleshman, M.A., Argueta, I.J., Austin, C.A., Lee, H.H., Moyer, E.J., and Gerling, G.J.
  • Computation identifies structural features that govern neuronal firing properties in slowly adapting touch receptors, eLife, 3:e01448, 2014. Lesniak, D.R., Marshall, K.L., Wellnitz, S.A., Jenkins, B.A., Baba, Y., Rasband, M.N., Gerling, G.J., and Lumpkin, E.A.
  • Validating a population model of tactile mechanotransduction of slowly adapting type I afferents at levels of skin mechanics, single-unit response, and psychophysics, IEEE Transactions on Haptics, 7(2):216-228, Apr-Jun 2014. Gerling, G.J., Rivest, I.I., Lesniak, D.R., Scanlon, J.R., and Wan, L.

Courses Taught

  • SYS3023: Human-Machine Interface
  • SYS6007: Human Factors I
  • SYS4024/6024: User Experience Design
  • SYS6026: Quantitative Models of Human Perceptual Information Processing
  • SYS6064: Applied Human Factors Engineering (Accelerated Masters Program)

Featured Grants & Projects

  • Tying the finger pad's deformation to perception of compliance, or 'softness'

    Touch-interactive displays, such as phones and tablets, have become a normal part of life over the past ten years. To replicate touching another's hand or tissues in medical exams, the next generation of displays needs to relay a sense of compliance, or 'softness.' While a number of groups are working to mimic compliant interactions at the finger pad—by employing tilting plates, concentric rings, particle jamming, and fabric stretch—these displays do not yet feel natural. To address a gap in the understanding of the biomechanical cues at the finger pad, we have developed a method for 3-D visual observation of the skin's surface when deformed by transparent compliances. We are studying how the finger pad must deform in space and time to adequately encode the percept of compliance. In particular, time-dependent cues, or information in the rate of change of skin deformation over a spatial field, may be vital to information transfer and mechanical efficiency.

  • Defining the principles that govern the neural output for touch receptors

    Despite past progress, the principles that govern the neural output of touch receptors have not been fully defined. We take an approach of combining computational models with experimental methods. Our work in collaboration with Dr. Ellen A. Lumpkin of Columbia University has sought to define rules in the end organ architecture of the Merkel cells and neurites that are innervated by slowly adapting type I afferents. We have built end organ models to show that the grouping of Merkel cells to heminodes strongly influences the sensitivity function of an afferent. These single-unit computational models combine finite elements to capture skin mechanics, differential equations to represent sensory transduction, and integrate-and-fire models to mimic neural dynamics.
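    As a simplified illustration of the final stage of such a pipeline, the snippet below sketches a generic leaky integrate-and-fire neuron. All parameter values here are illustrative placeholders, not those of the lab's published models: the membrane voltage integrates an input current, emits a spike whenever it crosses threshold, and then resets.

    ```python
    def lif_spike_times(current, dt=1e-4, tau=0.02, r=1e7,
                        v_rest=-0.065, v_thresh=-0.050, v_reset=-0.065):
        """Simulate a leaky integrate-and-fire neuron with forward-Euler steps.

        current: sequence of input current samples (A), one per time step dt (s).
        tau, r: membrane time constant (s) and resistance (ohm) -- placeholders.
        Returns the times (s) at which spikes occur.
        """
        v = v_rest
        spikes = []
        for step, i_in in enumerate(current):
            # dV/dt = (-(V - V_rest) + R * I) / tau
            v += dt * (-(v - v_rest) + r * i_in) / tau
            if v >= v_thresh:
                spikes.append(step * dt)  # record spike time
                v = v_reset               # reset membrane voltage
        return spikes

    # A 100 ms step of suprathreshold current elicits a regular spike train;
    # zero current elicits none.
    train = lif_spike_times([2e-9] * 1000)
    ```

    In a full single-unit model as described above, the constant current here would instead be the transduction current produced by the differential-equation stage, itself driven by stresses from the finite element skin model.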

  • Compression measurements of skin during development and aging

    Because the end organs of touch receptors are so intertwined with the skin substrate in which they are embedded, we have performed systematic measurements of the skin stiffness and thickness that are vital to constraining our models. These measurements have indicated that the skin systematically thickens with phases of the mouse hair cycle, along with other changes in the skin's time-dependent relaxation. These skin changes, in conjunction with changes in the end organ, could impact homeostatic mechanisms.

  • Computational modeling indicates that surface pressure can be reliably conveyed to tactile receptors even amidst changes in skin mechanics

    Distinct patterns in neuronal firing are observed between classes of cutaneous afferents. Such differences may be attributed to end organ morphology, distinct ion-channel complements, and skin microstructure, among other factors. Moreover, even within the class of slowly adapting type I afferents, neuronal sensitivity is quite variable between units. Do changes in the skin's mechanics contribute to such variance? Our studies employing finite element analysis suggest that skin can reliably convey indentation magnitude, rate, and spatial geometry to the locations of tactile receptors amidst natural changes in skin mechanics. Indeed, there can be large variance in neural firing responses amidst skin changes. However, this variance can be significantly reduced if mechanosensitive afferents are tuned to respond to a stimulus's surface pressure and to stress interior to the skin near the locations of the sensory end organs.

  • Medical training simulator design and testing

    The use of medical simulators has become increasingly widespread as the use of animals and patients for the training of novices has come under scrutiny, and as medical errors (estimated by the Institute of Medicine at 44,000-98,000 unnecessary deaths per year) become better understood. We have worked with clinicians and medical and nursing educators to create human-machine interfaces to train health care practitioners. Specifically, we designed, built, and evaluated physical-computerized simulators to train clinical palpation skills in breast and prostate screening. The goal is to ensure that clinicians' skills are trained systematically, time-efficiently, and to a high degree of accuracy. We built the simulators after collecting measurements of prostate tissue stiffness obtained in the clinic, as well as psychophysical evaluations of the tactile detection and discrimination thresholds of those who perform these exams. This work was done in collaboration with Dr. Marcus Martin and Prof. Reba Moyer Childress.

  • User experience design

    I teach courses on human-machine interaction and user experience design. My capstone teams have worked on UX problems for clients in many industries. In general, we design graphical, virtual, and physical user interfaces that display information and support work processes customized to the mental models, cognitive and perceptual capabilities, and tasks and work domains of end users. Important topics include function allocation in automated systems and interaction with large data sets.

  • A mobile device to quantify the forces and angles required to overcome knee arthrofibrosis in the clinic

    Knee arthrofibrosis, a condition where scar tissue limits motion of the knee, is encountered after 4-35% of ligament surgeries. We designed, built, and evaluated a means of quantifying the forces and angles required to overcome arthrofibrosis during knee manipulation. To capture the relevant cues, the device was built to be mobile and to attach to a common knee brace. Fixtures were designed to afford normal physician postures and manual control of torque. The device utilizes load cells, rotary potentiometers, 3D-printed parts, and integrated circuits. Graphical user interfaces on a mobile phone allow in-the-loop evaluation, and a database supports subsequent, offline analysis. The device is being tested in the clinic with our collaborator Dr. Seth Yarboro.