236862 – Course Webpage – Winter Semester 2020/2021
Sparse and Redundant Representations
and Their Applications in Signal and Image Processing
Note 1: This course may be given in English if international students at the Technion participate in class
Note 2: Currently, the plan is to give this course fully online
|Lecturer:||Michael Elad (reception hours: anytime, but please set it up by an email)|
|Time & Place:||Thursday, 10:30-12:30 [Room: Taub 6 if frontal classes are held]. Zoom link for the first 6 meetings: https://technion.zoom.us/j/96133553794|
Graduate students are not bound by these requirements
|Literature:||Recently published papers and my book “Sparse and Redundant Representations – From Theory to Applications in Signal and Image Processing” that can be found in the library|
A few years ago we worked hard to convert this advanced course into a MOOC (Massive Open Online Course) served through EdX. This means that all the material we cover can be taught through short videos, interactive work, wet projects, and more, all over the Internet. By the way, you are most welcome to watch the recorded videos on YouTube (Part 1, Part 2) to get a feeling for this course's content. On October 24th, 2020, we formally start this two-part course, and it will be open in parallel to anyone interested, all around the world.
What about you, Technion students? A major part of this course (236862) will be taught as the MOOC described above, augmented by mandatory weekly meetings for discussions (we have yet to decide whether these will be frontal or Zoom-based). This means that a major part of your work will be done independently over the Internet, covering 50% of your activity in this course. The remaining 50% consists of the weekly meetings and an additional research project assignment. More explanations on this special structure will be given at the beginning of the semester.
In the fields of signal and image processing and machine learning there is a fascinating arena of research that has drawn a lot of interest in the past ~15 years, dealing with sparse and redundant representations. One can regard this branch of activity as a natural continuation of the vast activity on wavelet theory, which thrived in the 90's. Another perspective, the one we shall adopt in this course, is to consider this developing field as the emergence of a highly effective model for data that extends and generalizes previous models. As models play a central role in practically every task in data processing, the effect of the new model is far-reaching. The core idea in sparse representation theory is the development of a novel redundant transform, in which the number of representation coefficients is larger than the signal's original dimension. Alongside this "waste" in the representation comes a fascinating new opportunity: seeking the sparsest possible representation, i.e., the one with the fewest non-zeros. This idea leads to a long series of beautiful (and, surprisingly, solvable) theoretical and numerical problems, and to many applications that benefit directly from the newly developed theory. In this course we will survey this field, starting with the theoretical foundations and systematically covering the knowledge gathered over the past years. The course will touch on theory, numerical algorithms, applications in image processing, and connections to machine learning, and more specifically to deep learning.
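To make the core idea concrete, here is a minimal sketch, not part of the course material, of seeking a sparse representation over a redundant dictionary with greedy Orthogonal Matching Pursuit, one of the pursuit algorithms studied in this field. The dictionary size, the sparsity level `k`, and all the numbers are made-up demo assumptions:

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal Matching Pursuit: greedily build a k-sparse x with A @ x ~ b."""
    m, n = A.shape
    residual = b.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # Pick the dictionary atom (column) most correlated with the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit over the chosen support, then update the residual.
        coeffs, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        x = np.zeros(n)
        x[support] = coeffs
        residual = b - A @ x
    return x

# A redundant (overcomplete) dictionary: more atoms (50) than signal dimension (20).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
A /= np.linalg.norm(A, axis=0)            # normalize the atoms
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.5, -2.0, 0.7]    # a 3-sparse representation
b = A @ x_true
x_hat = omp(A, b, k=3)
print("non-zeros:", np.count_nonzero(x_hat),
      "residual norm:", np.linalg.norm(b - A @ x_hat))
```

The exhaustive search for the sparsest representation is combinatorial, so the course studies when greedy methods like the one above, or convex relaxations such as Basis Pursuit (l1 minimization), are provably guaranteed to find it.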
- Note that the course has a very unusual format (MOOC + meetings + a final project).
- There will be 5 wet HW assignments within the EdX course and various quizzes. The wet HW will concentrate on Matlab/Python implementation of algorithms that will be discussed in class.
- The course requirements include a final project, to be performed in pairs (or individually), based on my own recently published papers. So, just look at my web-page and come discuss your choice with me. Those of you who are interested in other papers are welcome to approach me and discuss them. Instructions on how to proceed are shared with you as well. The project will include
- A final report (10-20 pages) summarizing these papers, their contributions, and your own findings (open questions, simulation results, etc.).
- A PowerPoint presentation of the project, to be presented to the lecturer at the end of the semester.
- The (hard!!) deadline for the project submission is May 2nd 2021. No delays will be allowed.
- Attendance of the meetings in (possibly virtual) class is highly recommended but not mandatory.
- 50% – MOOC grade
- 50% – Project (content, presentation, & report)
For those interested (applies to Technion’s students only):
- Free listeners are welcome.
- Please send (to both firstname.lastname@example.org & email@example.com) an email so that we add you to the course mailing list, and give you free access to the EdX system.
- Just show up to the first meeting on October 22nd (link given above) and we’ll take it from there.
|22.10.20: Welcome & Introduction|
|29.10.20: Section 1|
|05.11.20: Section 2|
|12.11.20: Section 3|
|19.11.20: Section 4|
|26.11.20: Section 5|
Course 2: [Dates will be filled at a later stage, towards the end of November 2020]
|10.12.20: Section 1 and Additional slides (Analysis model)|
|24.12.20: Section 2 and Additional slides (Graph-signals)|
|31.12.20: Section 3 and Additional slides (Compression)|
|07.01.21: Section 4 and Additional slides (MMSE via Stochastic Resonance)|
|14.01.21: Section 5 and Additional slides (The CSC model and relation to Deep-Learning)|
Announcements (newest on top)
- October 12, 2020: Currently, the plan is to give this course fully online. The zoom link for the meeting is THIS