236862 – Course Webpage – Winter Semester 2019/2020

Sparse and Redundant Representations

and Their Applications in Signal and Image Processing

(236862)

Winter Semester, 2019/2020

Note: This course will be given in English if foreign students participate in class

 

Lecturer: Michael Elad (reception hours: anytime, but please set it up by an email)
Credits: 3 Points
Time & Place: Thursday, 10:30-12:30, Room: Taub 701 (seventh floor) 
Prerequisites:
  1. Numerical Algorithms (234125) + Elementary Signal/Image Processing (236200) or
  2. Electrical Engineering’s course on Image Processing (046200)

Graduate students are exempt from these prerequisites.

Literature: Recently published papers and my book “Sparse and Redundant Representations – From Theory to Applications in Signal and Image Processing”, which can be found in the library.

Course Description


A few years ago we worked hard to convert this advanced course into a MOOC (Massive Open Online Course) offered through EdX. This means that all the material we cover can be taught through short videos, interactive work, wet projects, and more, all through the Internet. By the way, you are most welcome to watch the recorded videos on YouTube (Part 1, Part 2). On October 24th, 2019 we formally start this two-part course, and it will be open in parallel to anyone interested, all around the world.

What about you – Technion’s students? A major part of this (236862) course will be taught as the MOOC described above, augmented by mandatory weekly meetings for discussions. This means that a major part of your work will be done independently through the Internet. This will cover 50% of your activity in our course. The remaining 50% includes the weekly meetings and an additional research project assignment. More explanations on this special structure will be given at the beginning of the semester.

Course Content

In the fields of signal and image processing and machine learning there is a fascinating arena of research that has drawn a lot of interest in the past ~15 years, dealing with sparse and redundant representations. One can regard this branch of activity as a natural continuation of the vast activity on wavelet theory, which thrived in the 90’s. Another perspective – the one we shall adopt in this course – is to consider this developing field as the emergence of a highly effective model for data that extends and generalizes previous models. As models play a central role in practically every task in data processing, the effect of the new model is far reaching. The core idea in sparse representation theory is the development of a novel redundant transform, in which the number of representation coefficients is larger than the signal’s original dimension. Alongside this “waste” in the representation comes a fascinating new opportunity – seeking the sparsest possible representation, i.e., the one with the fewest non-zeros. This idea leads to a long series of beautiful (and, surprisingly, solvable) theoretical and numerical problems, and many applications that can benefit directly from the newly developed theory. In this course we will survey this field, starting with the theoretical foundations and systematically covering the knowledge that has been gathered in the past years. The course will touch on theory, numerical algorithms, applications in image processing, and connections to machine learning, and more specifically to deep learning.
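To give a flavor of the sparse coding task described above (and of the kind of Matlab/Python exercises that appear in the wet HW), here is a minimal, illustrative Python sketch of a greedy pursuit in the spirit of Orthogonal Matching Pursuit. The function name, the random dictionary D, and the sparsity level k are hypothetical choices for this sketch, not part of the course material.

```python
import numpy as np

def omp(D, y, k):
    """Greedy pursuit sketch: approximate the sparsest x with D @ x ≈ y.
    D: (n, m) redundant dictionary with m > n, columns assumed normalized.
    y: (n,) signal to represent.
    k: target number of non-zeros (hypothetical sparsity level)."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Least-squares fit of y over the selected atoms
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x

# Toy usage (illustrative data): a random redundant dictionary and a 3-sparse signal
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)                 # normalize the atoms
x_true = np.zeros(50)
x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
y = D @ x_true
x_hat = omp(D, y, k=3)
print(np.nonzero(x_hat)[0])                    # recovered support
```

The course itself develops when and why such greedy and relaxation-based pursuits succeed, which this toy example does not attempt to show.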

Course Requirements

  • Note that the course has a very unusual format (MOOC + mandatory meetings + a final project).
  • There will be 5 wet HW assignments within the EdX course, as well as various quizzes. The wet HW will concentrate on Matlab/Python implementation of algorithms that will be discussed in class.
  • The course requirements include a final project to be performed in pairs (or individually), based on recently published papers. Every year we publish a list of relevant papers for you to choose from for these projects. This year we will focus on my own papers (published in the past 5 years), so just look at my web-page and come to discuss your choice with me. Those of you who are interested in other papers are welcome to approach me and discuss it. Instructions on how to proceed are shared with you as well. The project will include:
    1. A final report (10-20 pages) summarizing these papers, their contributions, and your own findings (open questions, simulation results, etc.).
    2. A PowerPoint presentation of the project, to be presented to the lecturer at the end of the semester.
    3. The (hard!!) deadline for the project submission is May 2nd 2020. No delays will be allowed.
  • Attendance at the class meetings is MANDATORY, and will be checked. Each student may skip two meetings without affecting his/her grade. Beyond that, every absence will cost 5% of the final grade.

Grading

  • 50% – MOOC grade
  • 50% – Project (content, presentation, & report)
  • Note again: class attendance is mandatory (up to 2 absences are permitted without any harm; beyond that, each absence decreases the final grade by 5%)

 For those interested (applies to Technion’s students only):

Course Material

Course 1:

24.10.19: Welcome & Introduction
07.11.19: Section 1
14.11.19: Section 2
21.11.19: Section 3
28.11.19: Section 4
05.12.19: Section 5

Course 2:

12.12.19: Section 1 and Additional slides (Analysis model)
22.12.19: Section 2 and Additional slides (Graph-signals)
02.01.20: Section 3 and Additional slides (Compression)
16.01.20: Section 4 and Additional slides (MMSE via Stochastic Resonance) 
23.01.20: Section 5 and Additional slides (The CSC model and relation to Deep-Learning)

Announcements (newest on top)

  • October 29th 2019: Every year we publish a list of relevant papers for you to choose from for your projects. This year, we will focus on my own papers (published in the past 5 years), so please look at my web-page and come to discuss your choice with me.
  • Notice the change of dates for our coming meetings. Specifically, December 22nd is a Sunday, but its teaching schedule is set as a Thursday, so we will hold a meeting on that day.