Hadi Daneshmand
Assistant Professor of Computer Science, University of Virginia
About
Before joining UVA, I was a postdoctoral researcher at FODSI, jointly hosted by MIT and Boston University. I completed my PhD in computer science at ETH Zurich. I develop theoretical guarantees for deep neural networks and study the mechanisms underlying them. While deep nets are often viewed as statistical parametric models, I study neural networks from a computational perspective, linking their feature-extraction mechanisms to continuous optimization methods.
Education
Ph.D. in Computer Science, ETH Zurich, 2020
Research Interests
Foundations of Machine Learning
Understanding neural networks
Optimization for machine learning
Analyzing stochastic processes with applications in machine learning
Selected Publications
Towards Training Without Depth Limits: Batch Normalization Without Gradient Explosion, ICLR 2024
A. Meterez, A. Joudaki, F. Orabona, A. Immer, G. Rätsch, H. Daneshmand
Transformers learn to implement preconditioned gradient descent for in-context learning, NeurIPS 2023
K. Ahn, X. Cheng, H. Daneshmand, S. Sra
Efficient displacement convex optimization with particle gradient descent, ICML 2023
H. Daneshmand, J. Lee, C. Jin
Local saddle point optimization: A curvature exploitation approach, AISTATS 2019
L. Adolphs, H. Daneshmand, A. Lucchi, T. Hofmann
Awards
Spotlight award, ICML Workshop on In-Context Learning
SNSF Early Postdoc.Mobility fellowship
FODSI postdoctoral fellowship
Reviewer awards: ICML 2022, ICML 2019, and NeurIPS 2020