This op-ed appeared in Times Higher Education on Sep. 28, 2019

Over the past half-century, biomedical engineers have helped surmount some of medicine’s greatest challenges, designing artificial limbs and organs, next-generation imaging machines, and drug delivery systems to combat many diseases and extend our life spans. By bridging the biomedical and engineering disciplines, biomedical engineers have been able not only to develop powerful solutions, but to do so with a deep understanding of the physiological problems that those solutions are designed to tackle.

Now, as we continue to navigate the 21st century and hand over the keys of medical advancement to the next generation of biomedical engineers, we must realise that they face even more intractable challenges, ranging from cancer and diabetes to dementia. To surmount these obstacles, they need to adopt a new approach, rooted in data science, quantitative analysis, and systems modeling.

Over the past decade, we have seen major advances in our ability to acquire data on human health – and a corresponding need to understand and utilise these data in ways that improve health care in America. 

Our biomedical engineers now have the ability to explore how systems of cells operate; mine massive, complex datasets in our nation’s health records; utilise omics data (such as genomics and proteomics) to optimise patient-specific therapies; apply data science to medical imaging; and use cutting-edge modeling techniques to explore the basic biochemistry related to cancer, diabetes, and other diseases.

Yet many biomedical engineering departments across the country offer curricula that do not sufficiently prepare graduates to tap into these transformative capabilities. Too often, our future professionals cannot readily access foundational courses in data science, systems modeling, and machine learning, even as an increasing number of jobs require these core competencies.

Earlier this summer, GE Healthcare posted 44 job openings for data scientists. Verily, the life sciences subsidiary of Google’s parent company, Alphabet, is striving to leverage deep learning technology to study massive biological datasets related to disease processes. 

And companies such as Medtronic and Novartis are racing to recruit the staff who can harness the power of big data to inform their next waves of medical devices and drugs. Employment in the field is on pace to expand by 23 percent in the next five years in the US, one of the fastest rates of growth of any occupation in the country.

Biomedical engineering departments across the country must make changes if they hope to address this demand for a new crop of engineers who are fluent in biomedical data science. 

At the University of Virginia Schools of Engineering and Medicine, we have made a commitment to hiring staff who work across disciplines and engage in research that builds on a deeply rooted mathematical and computational approach to biomedical systems.

Beyond leveraging the power of data science in their research programs, the 100 biomedical engineering departments across the country can structure curricula that emphasise linear algebra, statistics, systems modeling, signal processing, machine learning, and deep learning, alongside cell and molecular biology and physiology. They can embed core data science within biological concepts across their students’ curricular pathways.

At UVA, we are acting on the imperative to promote data-driven, multidisciplinary, and modeling approaches across our undergraduate and graduate programs, recognising that our future biomedical engineers need these opportunities to thrive in a rapidly transforming field.

Universities do not have to start this work from scratch. For example, the National Institutes of Health has already established a series of grant programs for PhD students that encourage the intersection of biomedical engineering and data science.

Meanwhile, the National Science Foundation has launched field-wide dialogues on how to model biological systems and draw correlations between different datasets.

In addition to building on government-driven initiatives, schools of engineering can partner with some of the leading employers in this space to help structure curricula aligned to the skills they need, offer hands-on, project-based training opportunities, and create pipelines to the jobs of tomorrow. 

It is a win-win-win: for the crop of healthcare and technology companies that need engineers trained with these skills and knowledge, for biomedical engineering departments that are better positioned to solve today’s medical challenges, and for our nation.

As we continue to develop biomedical technologies that generate ever more personal data, we must also consider data security and cyber threats. By equipping our future biomedical engineers with the tools to understand, navigate and leverage big data, we are well-positioned to launch – and help protect – the next wave of transformative, secure discoveries in pharmaceuticals, healthcare, and medical imaging.

Equipped in this way, the next generation of biomedical engineers can take the baton and surmount this century’s greatest medical challenges – from curing cancer and ending diabetes to preventing Alzheimer’s disease.

Fred Epstein says BME departments need to rework their curricula to create data-driven biomedical engineers

Fred Epstein is a professor of BME and chair of the department. He develops magnetic resonance imaging techniques for assessing the structure, function, and perfusion of the cardiovascular system, particularly in the setting of cardiovascular disease, diabetes, and musculoskeletal disease.