Statistical Pattern Recognition

Prof. Thomas Brox

Statistical pattern recognition, often better known under the term "machine learning", is a key element of modern computer science. Its goal is to find, learn, and recognize patterns in complex data, for example in images, speech, biological pathways, or the internet. In contrast to classical computer science, where the computer program, i.e., the algorithm, is the key element of the process, machine learning uses a learning algorithm, but in the end the actual information resides not in the algorithm but in the representation of the data processed by this algorithm.

This course gives an introduction to the main tasks of machine learning: classification, regression, and clustering. In classification, we learn a decision function from annotated training examples (e.g., a set of dog and non-dog images). Given a new image, the classifier should be able to tell whether or not it shows a dog. In regression, we learn a mapping from an input to a continuous output; again, this mapping is learned from a set of input/output pairs. Both classification and regression are supervised methods, as the data comes together with the correct output. Clustering is an unsupervised learning method: we are given only unlabeled data, and clustering should separate it into reasonable subsets. The course is based in large part on the textbook "Pattern Recognition and Machine Learning" by Christopher Bishop. The exercises consist of theoretical assignments and programming assignments in Python.
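As a small taste of the three task types, here is a minimal, self-contained NumPy sketch (not part of the course material; all data and function names are illustrative): a 1-nearest-neighbour classifier, a least-squares line fit for regression, and a few k-means iterations for clustering.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Classification: 1-nearest-neighbour on two labeled toy clusters ---
X_train = np.vstack([rng.normal(0.0, 0.5, (20, 2)),   # class 0, centered at (0, 0)
                     rng.normal(3.0, 0.5, (20, 2))])  # class 1, centered at (3, 3)
y_train = np.array([0] * 20 + [1] * 20)

def predict_1nn(x):
    """Return the label of the closest training point."""
    return y_train[np.argmin(np.linalg.norm(X_train - x, axis=1))]

print(predict_1nn(np.array([2.8, 3.1])))  # near the class-1 cluster, so predicts 1

# --- Regression: least-squares fit of y = w0 + w1 * x from input/output pairs ---
x = np.linspace(0.0, 1.0, 30)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.05, 30)   # noisy samples of a known line
A = np.column_stack([np.ones_like(x), x])       # design matrix [1, x]
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w)  # roughly [2.0, 3.0]

# --- Clustering: k-means on the same points, pretending labels are unknown ---
data = X_train
centers = data[rng.choice(len(data), 2, replace=False)]
for _ in range(10):
    dists = np.linalg.norm(data[:, None] - centers[None], axis=2)  # (40, 2)
    assign = np.argmin(dists, axis=1)
    centers = np.array([data[assign == k].mean(axis=0) for k in range(2)])
print(np.sort(centers[:, 0]))  # recovers two cluster centers, near 0 and 3
```

The same ideas appear in the exercises in fuller form; libraries such as scikit-learn package them behind ready-made estimators.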

The content of this course is complementary to the Machine Learning course offered by Joschka Boedecker and Frank Hutter; attending both courses makes sense if you want to specialize in machine learning. It also complements the Deep Learning course.

The lecture will be provided as an online course. There is recorded class material, augmented by a weekly online Zoom meeting that provides additional updates (the state of the art is changing rapidly) and allows you to ask questions about the material. Be aware that the online meetings will not be recorded. The exercises will also be handled online, via an online forum where you can seek the help of other students and via weekly Zoom meetings where you can interact with the exercise advisors. Access information will be provided by email in the first lecture week. Ensure that you are registered for the course before that week.

Those who were not registered in time, for whatever reason, can log in to the Discussion Forum and find the information there.

Note: This semester we will provide exercise material in Python (Jupyter notebook format), available in this GitHub repository. Not all exercises are available immediately, as they are still under development; they will be added as the course progresses. Please check the GitHub repository for updates.

Lecture (Q&A):
(2 SWS)
Thursday, 11:00-12:00
Online meeting in Zoom

Exercises:
(2 SWS)
Monday, 15:00-16:00
Online meeting in Zoom
Contact persons: Max Argus, Tonmoy Saikia
Discussion Forum

Exam: Written exam.

Beginning: Lecture: Thursday, April 22, 2021
Exercises: Monday, April 26, 2021

ECTS Credits: 6

Recommended semester: 1 or 2 (MSc)
Requirements: Fundamental mathematical knowledge, particularly statistics.

Exam: Written exam on September 22, 2021, 13:00-14:00.

Remarks: Full completion of all relevant theoretical and programming assignments is highly recommended.


Slides and Recordings

Watch before | Topic                                    | Slides                | Recordings            | Solutions
22.4.        | Class 1: Introduction                    | MachineLearning01.pdf | MachineLearning01.mp4 | ex1.ipynb
29.4.        | Class 2: Probability distributions       | MachineLearning02.pdf | MachineLearning02.mp4 | ex2.ipynb
6.5.         | Class 3: Mixture models, clustering, and EM | MachineLearning03.pdf | MachineLearning03.mp4 | ex3.ipynb
20.5.        | Class 4: Nonparametric methods           | MachineLearning04.pdf | MachineLearning04.mp4 | ex4.ipynb
10.6.        | Class 5: Regression                      | MachineLearning05.pdf | MachineLearning05.avi | ex5.ipynb
17.6.        | Class 6: Gaussian processes              | MachineLearning06.pdf | MachineLearning06.avi | ex6.ipynb
24.6.        | Class 7: Classification                  | MachineLearning07.pdf | MachineLearning07.mp4 | ex7.ipynb
1.7.         | Class 8: Support vector machines         | MachineLearning08.pdf | MachineLearning08.mp4 | ex8.ipynb
8.7.         | Class 9: Projection methods              | MachineLearning09.pdf | MachineLearning09.mp4 | ex9.ipynb
15.7.        | Class 10: Inference in graphical models  | MachineLearning10.pdf | MachineLearning10.mp4 | ex10.ipynb
22.7.        | Class 11: Sampling methods               | MachineLearning11.pdf |                       |

Additional information and some helpful hints regarding the exercises can be found here.