236862 – Course Webpage – Winter Semester 2020/2021

Sparse and Redundant Representations

and Their Applications in Signal and Image Processing

(236862)

Winter Semester, 2020/2021

Note: This course will be given in Hebrew, in an online (Zoom meetings) format

Lecturer: Michael Elad (reception hours: anytime, but please set them up by email)
Credits: 3 Points
Time & Place: Thursday, 10:30-12:30

Zoom link for ALL the meetings: https://technion.zoom.us/j/96133553794

Prerequisites:
  1. Numerical Algorithms (234125) + Elementary Signal/Image Processing (236200) or
  2. Electrical Engineering’s course on Image Processing (046200)

Graduate students are exempt from these prerequisites

Literature: Recently published papers and my book “Sparse and Redundant Representations – From Theory to Applications in Signal and Image Processing”, which can be found in the library

Course Description


A few years ago we worked hard to convert this advanced course into a MOOC (Massive Open Online Course) served through EdX. This means that all the material we cover can be taught through short videos, interactive work, wet projects, and more, all through the Internet. By the way, you are most welcome to look at the recorded videos on YouTube (Part 1, Part 2) in order to get a feeling for this course’s content. On October 24th, 2020 we formally launch this two-part course, and it will be open in parallel to anyone interested, all around the world.

What about you – Technion’s students? A major part of this (236862) course will be taught as the MOOC described above, augmented by mandatory weekly meetings for discussions (we have yet to decide whether these will be frontal or Zoom-based). This means that a major part of your work will be done independently through the Internet, covering 50% of your activity in our course. The remaining 50% includes the weekly meetings and an additional research project assignment. More explanations on this special structure will be given at the beginning of the semester.

Course Content

In the fields of signal and image processing and machine learning there is a fascinating new arena of research that has drawn a lot of interest in the past ~15 years, dealing with sparse and redundant representations. One can regard this branch of activity as a natural continuation of the vast activity on wavelet theory, which thrived in the 90’s. Another perspective – the one we shall adopt in this course – is to consider this developing field as the emergence of a highly effective model for data that extends and generalizes previous models. As models play a central role in practically every task in data processing, the effect of the new model is far reaching. The core idea in sparse representation theory is the development of a novel redundant transform, in which the number of representation coefficients is larger than the signal’s original dimension. Alongside this “waste” in the representation comes a fascinating new opportunity – seeking the sparsest possible representation, i.e., the one with the fewest non-zeros. This idea leads to a long series of beautiful (and, surprisingly, solvable) theoretical and numerical problems, and many applications that can benefit directly from the newly developed theory. In this course we will survey this field, starting with the theoretical foundations, and systematically covering the knowledge that has been gathered in the past years. The course will touch on theory, numerical algorithms, applications in image processing, and connections to machine learning – more specifically, to deep learning.
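To get a concrete feeling for the “sparsest representation” idea described above, here is a minimal sketch (not part of the course material – just an illustration) of one classic greedy pursuit algorithm, Orthogonal Matching Pursuit (OMP), which seeks a representation of a signal x over a redundant dictionary D using at most k non-zero coefficients. The function name and structure are this sketch’s own choices; the algorithm itself is one of those covered in the course.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal Matching Pursuit (sketch): approximate x ≈ D @ a
    with at most k non-zero entries in a.

    D : (n, m) dictionary with m > n (redundant), columns ("atoms") normalized.
    x : (n,) signal to represent.
    k : target sparsity (maximum number of non-zeros).
    """
    residual = x.copy()
    support = []                      # indices of the chosen atoms
    a = np.zeros(D.shape[1])
    for _ in range(k):
        # greedily pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit all chosen coefficients by least squares on the support
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        a = np.zeros(D.shape[1])
        a[support] = coef
        residual = x - D @ a
    return a
```

A quick way to try it: draw a random redundant dictionary, synthesize a signal from two atoms, and check that the pursuit returns a representation with at most two non-zeros whose reconstruction error is small.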

Course Requirements

  • Note that the course has a very unusual format (MOOC + meetings + a final project).
  • There will be 5 wet (hands-on) HW assignments within the EdX course, along with various quizzes. The wet HW will concentrate on Matlab/Python implementations of algorithms discussed in class.
  • The course requirements include a final project, to be performed in pairs (or individually), based on my own recently published papers. So, just look at my web-page and come discuss your choice with me. Those of you who are interested in other papers are welcome to approach me and discuss it. Instructions on how to proceed are GIVEN HERE. The project will include:
    1. A final report (10-20 pages) summarizing these papers, their contributions, and your own findings (open questions, simulation results, etc.).
    2. A PowerPoint presentation of the project, to be presented to the lecturer at the end of the semester.
    3. The (hard!!) deadline for the project submission is May 2nd, 2021. No delays will be allowed.
  • Attendance at the (possibly virtual) class meetings is highly recommended but not mandatory.

Grading

  • 50% – MOOC grade
  • 50% – Project (content, presentation, & report)

 For those interested (applies to Technion’s students only):

  • Free listeners are welcome.
  • Please send an email (to both elad@cs.technion.ac.il & zadneprovski@gmail.com) so that we can add you to the course mailing list and give you free access to the EdX system.
  • Just show up to the first meeting on October 22nd (link given above) and we’ll take it from there.

Course Material

Course 1:

22.10.20: Welcome & Introduction Video Link
29.10.20: Section 1 Video Link
05.11.20: Section 2 Video Link
12.11.20: Section 3 Video Link
19.11.20: Section 4 Video Link
26.11.20: Section 5 Video Link

Course 2: [Dates will be filled at a later stage, towards the end of November 2020]

10.12.20: Section 1 and Additional slides (Analysis model) Video Link
24.12.20: Section 2 and Additional slides (Kernel Dictionary Learning) Video Link
31.12.20: Section 3 and additional slides (compression)  Video Link
07.01.21: Section 4 and Additional slides (MMSE via Stochastic Resonance)  Video Link
14.01.21: Section 5 and Additional slides (CSC and Deep-Learning) Video Link

Announcements (newest on top)

  • November 19, 2020: Please see this file for instructions regarding the wet exercises
  • October 12, 2020: Currently, the plan is to give this course fully online. The zoom link for the meeting is THIS