Talks

Sparse Modeling in Image Processing and Deep Learning
September 18, 2017
ICIP 2017, Beijing, China

Sparse approximation is a well-established theory, with a profound impact on the fields of signal and image processing. In this talk we start by presenting this model and its features, and then turn to describe two special cases of it: convolutional sparse coding (CSC) and its multi-layered version (ML-CSC). Amazingly, as we will carefully show, ML-CSC provides a solid theoretical foundation for the field of deep learning. Alongside this main message of bringing a theoretical backbone to deep learning, another central message will accompany us throughout the talk: generative models for describing data sources enable a systematic way to design algorithms, while also providing a complete mechanism for a theoretical analysis of these algorithms' performance. This talk is meant for newcomers to this field – no prior knowledge of sparse approximation is assumed.
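As a rough illustration of the sparse approximation model the talk opens with (not code from the talk itself), the sketch below recovers a sparse representation x of a signal y over a dictionary D by solving the Lasso-type problem min_x 0.5*||y - Dx||_2^2 + lam*||x||_1 with ISTA (iterative soft-thresholding). The dictionary size, step size, and regularization weight are arbitrary placeholders chosen only for the demo.

```python
import numpy as np

def ista(y, D, lam=0.1, n_iters=200):
    """Sparse coding of y over dictionary D via ISTA:
    minimize 0.5*||y - D x||_2^2 + lam*||x||_1."""
    # Step size 1/L, where L is the largest eigenvalue of D^T D
    L = np.linalg.norm(D, 2) ** 2
    x = np.zeros(D.shape[1])
    for _ in range(n_iters):
        grad = D.T @ (D @ x - y)                 # gradient of the quadratic term
        z = x - grad / L                         # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding
    return x

# Toy example: a random dictionary and a signal built from 3 atoms
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)                   # normalize the atoms
x_true = np.zeros(128)
x_true[rng.choice(128, 3, replace=False)] = rng.standard_normal(3)
y = D @ x_true
x_hat = ista(y, D, lam=0.05)
print("non-zeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```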

This is a KEYNOTE talk that was given at ICIP 2017 in Beijing, China. It summarizes portions of the doctoral work of my three PhD students, Vardan Papyan, Yaniv Romano, and Jeremias Sulam.