Computer Science
Ph.D. Qualifying Exam Presentation by Yinzhu Jin
Date: December 21, 2022, 10:00 AM (America/New_York)
Location: Rice 514

Measuring Feature Dependency of Neural Networks by Collapsing Feature Dimensions in the Data Manifold

Abstract:  

Neural networks, despite their high performance, are often considered black boxes. In applications such as medical imaging, it would be helpful to know whether a model relies on certain meaningful features. I propose to measure the feature dependency of neural networks by collapsing the data along feature dimensions and observing how this affects the model's performance. Two methods are attempted to achieve this. The first learns the data manifold and moves the data in the data space along feature directions, projecting the data points back onto the manifold throughout the process. Although this method fails, it provides intuition about moving along the data manifold. The second method learns a disentangled VAE with specific latent dimensions trained to correlate with meaningful features, so that a feature dimension can be represented by these latent dimensions. Experiments on several datasets show that this method successfully gives insight into a model's feature dependency.
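The second method can be illustrated with a toy sketch. Everything below is hypothetical and not the talk's actual models: a random linear map stands in for the disentangled VAE decoder, and a nearest-centroid classifier stands in for the network being probed. The idea is the same, though: collapse one latent dimension to its mean, re-decode the data, and read the model's feature dependency off the resulting change in accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical): latent z has two dimensions; the label
# depends only on z[:, 0], never on z[:, 1].
z = rng.normal(size=(1000, 2))
y = (z[:, 0] > 0).astype(int)

# A linear map stands in for the VAE decoder: latents -> data space.
W = rng.normal(size=(2, 5))
x = z @ W

# A nearest-centroid classifier stands in for the network being probed.
mu0, mu1 = x[y == 0].mean(axis=0), x[y == 1].mean(axis=0)

def predict(data):
    d0 = np.linalg.norm(data - mu0, axis=1)
    d1 = np.linalg.norm(data - mu1, axis=1)
    return (d1 < d0).astype(int)

def accuracy_after_collapse(dim):
    """Collapse one latent dimension to its mean, decode, re-evaluate."""
    z_c = z.copy()
    z_c[:, dim] = z[:, dim].mean()
    return (predict(z_c @ W) == y).mean()

baseline = (predict(x) == y).mean()
drop0 = baseline - accuracy_after_collapse(0)  # label-carrying dimension
drop1 = baseline - accuracy_after_collapse(1)  # irrelevant dimension
# drop0 should be large (the model depends on this feature);
# drop1 should be near zero (the model does not).
```

Collapsing the label-carrying dimension destroys the signal the classifier uses, so accuracy falls sharply; collapsing the irrelevant dimension leaves accuracy essentially unchanged, which is exactly the contrast the proposed measurement exploits.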


Committee:

  • Miaomiao Zhang, Committee Chair (ECE/CS/SEAS/UVA) 
  • Tom Fletcher, Advisor (ECE/CS/SEAS/UVA) 
  • Yangfeng Ji (CS/SEAS/UVA) 
  • Matthew Dwyer (CS/SEAS/UVA)