236862 – Course Webpage – Winter Semester 2021/2022
Sparse and Redundant Representations
and Their Applications in Signal and Image Processing
1. This course may be given in English (depending on foreign participants).
2. I intend to run a follow-up course (2 credit points) in the Spring semester that will focus on advanced topics in image processing. It will be given in a seminar format and is primarily meant for students who took 236862.
|Lecturer:||Michael Elad (reception hours: anytime, but please set it up by an email)|
|Time & Place:||Thursday, 10:30-12:30|
|Grade Policy:||50% – MOOC grade and 50% – Final exam|
Graduate students are not bound by these requirements
|Literature:||Recently published papers and my book “Sparse and Redundant Representations – From Theory to Applications in Signal and Image Processing” that can be found in the library|
A few years ago we worked hard to convert this advanced course into a MOOC (Massive Open Online Course) offered through EdX. This means that all the material we cover can be taught through short videos, interactive work, wet projects, and more, all through the Internet. By the way, you are most welcome to look at the recorded videos on YouTube (Part 1, Part 2) in order to get a feeling for this course's content. On October 24th, 2021, we formally start this two-part course, and it will be open in parallel to anyone interested, all around the world.
What about you – Technion’s students? A major part of this course (236862) will be taught as the MOOC described above, augmented by mandatory weekly meetings for discussions (we have yet to decide whether these will be in person or Zoom-based). This means that a major part of your work will be done independently through the Internet, covering 50% of your activity in our course. The remaining 50% includes the weekly meetings and an additional research project assignment. More explanations on this special structure will be given at the beginning of the semester.
In the fields of signal and image processing and machine learning there is a fascinating arena of research that has drawn a lot of interest in the past ~15 years, dealing with sparse and redundant representations. One can regard this branch of activity as a natural continuation of the vast activity on wavelet theory, which thrived in the 1990s. Another perspective – the one we shall adopt in this course – is to consider this developing field as the emergence of a highly effective model for data that extends and generalizes previous models. As models play a central role in practically every task in data processing, the effect of the new model is far reaching. The core idea in sparse representation theory is the development of a novel redundant transform, in which the number of representation coefficients is larger than the signal’s original dimension. Alongside this “waste” in the representation comes a fascinating new opportunity – seeking the sparsest possible representation, i.e., the one with the fewest non-zeros. This idea leads to a long series of beautiful (and, surprisingly, solvable) theoretical and numerical problems, and many applications that can benefit directly from the newly developed theory. In this course we will survey this field, starting with the theoretical foundations and systematically covering the knowledge that has been gathered in the past years. The course will touch on theory, numerical algorithms, applications in image processing, and connections to machine learning – and more specifically to deep learning.
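To give a concrete flavor of "seeking the sparsest representation over a redundant dictionary," here is a minimal Python sketch of Orthogonal Matching Pursuit (OMP), one of the greedy pursuit algorithms covered in this field. This is an illustrative toy implementation, not part of the course materials; all names and parameters are chosen for the example.

```python
import numpy as np

def omp(D, x, k):
    """Greedy Orthogonal Matching Pursuit: find a representation alpha with
    at most k non-zeros such that D @ alpha approximates x.
    D is an n-by-m redundant dictionary (m > n) with unit-norm columns."""
    residual = x.copy()
    support = []
    alpha = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit the signal on the chosen support via least squares.
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    alpha[support] = coeffs
    return alpha

# Usage: a signal built from 2 atoms of a 20-by-50 redundant dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)          # normalize atoms to unit norm
true_alpha = np.zeros(50)
true_alpha[[3, 17]] = [1.5, -2.0]
x = D @ true_alpha
est = omp(D, x, k=2)
```

The greedy step (picking the most correlated atom, then re-fitting on the accumulated support) is what makes this otherwise combinatorial sparsest-representation problem tractable in practice; the course develops the theory of when such pursuits provably succeed.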
- Note that the course has a very unusual format (MOOC + meetings + a final exam).
- There will be 5 wet HW assignments within the EdX course and various quizzes. The wet HW will concentrate on Matlab/Python implementation of algorithms that will be discussed in class.
- New!! The course requirements include a final exam. This replaces the research projects we used to run.
- Attendance at the meetings in class is mandatory (2 absences are permitted; beyond these, each absence leads to a 2% loss in the final grade).
- 50% – MOOC grade
- 50% – Final exam
For those interested (applies to Technion’s students only):
- Free listeners are welcome.
- Please send (to both firstname.lastname@example.org & email@example.com) an email so that we add you to the course mailing list, and give you free access to the EdX system.
- Just show up to the first meeting on October 22nd (link given above) and we’ll take it from there.
|22.10.20: Welcome & Introduction||Video Link|
|29.10.20: Section 1||Video Link|
|05.11.20: Section 2||Video Link|
|12.11.20: Section 3||Video Link|
|19.11.20: Section 4||Video Link|
|26.11.20: Section 5||Video Link|
Course 2: [Dates will be filled at a later stage, towards the end of November 2020]
|10.12.20: Section 1 and Additional slides (Analysis model)||Video Link|
|24.12.20: Section 2 and Additional slides (Kernel Dictionary Learning)||Video Link|
|31.12.20: Section 3 and Additional slides (Compression)||Video Link|
|07.01.21: Section 4 and Additional slides (MMSE via Stochastic Resonance)||Video Link|
|14.01.21: Section 5 and Additional slides (CSC and Deep-Learning)||Video Link|
Announcements (newest on top)
- None so far