Electrical and Computer Engineering
ECE Department Seminar: Tom Fletcher
Date: November 9, 2018, 12:00 PM (Eastern Time)
Location: Thornton Hall E 316

The Riemannian Geometry of Deep Learning 

Abstract: Deep generative models learn a mapping from a low-dimensional latent space to a high-dimensional data space. Under certain regularity conditions, these models parameterize nonlinear manifolds in the data space. In this talk, I will present work investigating the Riemannian geometry of these generated manifolds. First, we develop efficient algorithms for computing geodesic curves, which provide an intrinsic notion of distance between points on the manifold. Second, we develop an algorithm for parallel translation of a tangent vector along a path on the manifold. We show how parallel translation can be used to generate analogies, i.e., to transport a change in one data point into a semantically similar change of another data point. Our experiments on real image data show that the manifolds learned by deep generative models, while nonlinear, are surprisingly close to zero curvature. I will discuss the practical impact of this fact and hypothesize why it might be the case. As time permits, I will briefly cover recent work on applications of Riemannian geometry to adversarial examples, i.e., how to slightly manipulate data to “fool” a deep learning algorithm.
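To make the geometric setup concrete, the sketch below shows one generic way to work with the manifold a generator parameterizes: the pullback metric G(z) = J(z)^T J(z) from the decoder's Jacobian, and a geodesic approximated by minimizing the discrete energy of the decoded curve between two latent points. This is not the speaker's algorithm, only a minimal illustration; the `decoder` function, its dimensions, and all hyperparameters here are hypothetical placeholders.

```python
# Minimal sketch (assumed setup, not the speaker's implementation):
# geodesics on the manifold parameterized by a decoder g: R^d -> R^D,
# approximated by minimizing the discrete curve energy in data space.
import jax
import jax.numpy as jnp


def decoder(z):
    # Placeholder generator; in practice this would be a trained deep generative model.
    W = jnp.ones((50, 2)) * 0.1
    return jnp.tanh(W @ z)


def pullback_metric(z):
    # Riemannian metric induced on the latent space: G(z) = J(z)^T J(z),
    # where J is the Jacobian of the decoder at z.
    J = jax.jacfwd(decoder)(z)
    return J.T @ J


def curve_energy(interior, z_start, z_end):
    # Discrete energy of the decoded path; its minimizers approximate geodesics.
    pts = jnp.vstack([z_start[None, :], interior, z_end[None, :]])
    decoded = jax.vmap(decoder)(pts)
    diffs = decoded[1:] - decoded[:-1]
    return jnp.sum(diffs ** 2)


def geodesic(z_start, z_end, n_interior=8, steps=500, lr=0.1):
    # Initialize interior points on the straight latent-space segment, then
    # gradient-descend the curve energy while keeping the endpoints fixed.
    t = jnp.linspace(0.0, 1.0, n_interior + 2)[1:-1, None]
    interior = (1 - t) * z_start + t * z_end
    grad_fn = jax.jit(jax.grad(curve_energy))
    for _ in range(steps):
        interior = interior - lr * grad_fn(interior, z_start, z_end)
    return jnp.vstack([z_start[None, :], interior, z_end[None, :]])


z0 = jnp.array([0.0, 0.0])
z1 = jnp.array([1.0, -1.0])
path = geodesic(z0, z1)
print(path.shape)  # (10, 2): two fixed endpoints plus 8 optimized interior points
```

The curvature observation in the abstract can be probed in the same spirit: if the minimized geodesic stays close to the straight latent-space interpolation and distances match the flat case, the learned manifold behaves as if it has near-zero curvature.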