
2013 ISIT Plenary Lecture
High Dimensional Classification with Invariant Deep Networks 
Stéphane Mallat
École Normale Supérieure

Abstract

Intra-class variability is the curse of most high-dimensional classification problems. Fighting it means finding discriminative invariants. Classical mathematical invariants are either unstable to signal variabilities or lose too much information. Surprisingly, non-linear deep neural networks have become "hot" again, accumulating experimental successes over a wide range of applications in speech, images, and biological data. We show that such architectures build hierarchical invariants over cascades of Lie groups, which reduce signal variabilities while preserving discrimination. Invariants are computed with filters corresponding to wavelets defined on each group. They are learned from unsupervised data with sparse representation strategies that remain to be understood. Applications will be discussed and shown on images and sounds.
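
The cascade the abstract describes (wavelet filtering on a group, a non-linear modulus, then averaging) is closely related to the scattering transform. The toy NumPy sketch below is only an illustration of that idea for one-dimensional signals under the translation group; the Gaussian band-pass filters stand in for wavelets, and all function names are illustrative, not the speaker's implementation. It shows how iterating filtering, modulus, and averaging yields coefficients that are invariant to translation while retaining multiscale information.

import numpy as np

def dyadic_filter_bank(n, num_scales):
    """Frequency responses of Gaussian band-pass filters, one per dyadic scale.
    These stand in for the wavelets psi_j of the cascade (illustrative only)."""
    freqs = np.fft.fftfreq(n)
    return [np.exp(-((freqs - 0.25 / 2 ** j) ** 2) / (2 * (0.125 / 2 ** j) ** 2))
            for j in range(num_scales)]

def cascade_invariants(x, num_scales=4):
    """Two-layer wavelet-modulus cascade with global averaging.

    Layer 1: |x * psi_{j1}|               (filter, then complex modulus)
    Layer 2: ||x * psi_{j1}| * psi_{j2}|  for j2 > j1
    Averaging each output over the signal gives translation-invariant
    coefficients, while the cascade of scales preserves discriminability."""
    psi_hat = dyadic_filter_bank(len(x), num_scales)
    x_hat = np.fft.fft(x)

    feats = [x.mean()]                                   # order 0: plain average
    for j1 in range(num_scales):
        u1 = np.abs(np.fft.ifft(x_hat * psi_hat[j1]))    # order-1 envelope
        feats.append(u1.mean())
        u1_hat = np.fft.fft(u1)
        for j2 in range(j1 + 1, num_scales):             # order 2, coarser scales only
            feats.append(np.abs(np.fft.ifft(u1_hat * psi_hat[j2])).mean())
    return np.array(feats)

# Invariance check: a circular shift of the signal leaves the coefficients unchanged.
rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
print(np.allclose(cascade_invariants(x), cascade_invariants(np.roll(x, 17))))

In this sketch the averaging is global, so the invariance is exact but crude; the construction discussed in the talk instead averages locally with a low-pass filter at each layer, trading some invariance for stability and discrimination.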

 

Biography
Stéphane Mallat received the Ph.D. degree from the University of Pennsylvania in 1988. He was then a Professor at the Courant Institute of Mathematical Sciences until 1995, a Professor at École Polytechnique in Paris, and CEO of a start-up semiconductor company, and is now a Professor at École Normale Supérieure in Paris. Stéphane Mallat's research interests include signal processing, harmonic analysis, and learning.