Statistical Pattern Recognition

Prof. Thomas Brox

Statistical pattern recognition, also known as "machine learning", is a key element of modern computer science. Its goal is to find, learn, and recognize patterns in complex data, for example in images, speech, biological pathways, or the internet. In contrast to classical computer science, where the computer program, i.e. the algorithm, is the key element of the process, machine learning also involves a learning algorithm, but in the end the actual information lies not in the algorithm itself but in the representation of the data processed by this algorithm.

This course gives an introduction to the main tasks of machine learning: classification, regression, and clustering. In classification, we learn a decision function from annotated training examples (e.g., a set of dog and non-dog images); given a new image, the classifier should be able to tell whether it shows a dog or not. In regression, we learn a mapping from input variables to continuous output values; again, this mapping is learned from a set of input/output pairs. Both classification and regression are supervised methods, since the training data comes together with the correct output. Clustering, in contrast, is an unsupervised learning method: we are given only unlabeled data, and the goal is to separate it into reasonable subsets. The course is based in large parts on the textbook "Pattern Recognition and Machine Learning" by Christopher Bishop. The exercises will consist of theoretical assignments and programming assignments in Python.
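
As a rough, self-contained illustration of these three tasks (not part of the official course material), the following Python sketch applies an off-the-shelf classifier, regressor, and clustering method to toy data; the use of scikit-learn and all data and variable names are assumptions made purely for illustration.

    # Minimal sketch of the three learning tasks; toy data, illustrative only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression, LinearRegression
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Classification (supervised): learn a decision function from labeled examples.
    X_cls = rng.normal(size=(100, 2))
    y_cls = (X_cls[:, 0] + X_cls[:, 1] > 0).astype(int)   # labels, e.g. "dog" vs. "non-dog"
    clf = LogisticRegression().fit(X_cls, y_cls)
    print("predicted class:", clf.predict([[0.5, 0.5]]))

    # Regression (supervised): learn a mapping from inputs to continuous outputs.
    X_reg = rng.uniform(0.0, 1.0, size=(100, 1))
    y_reg = 3.0 * X_reg[:, 0] + rng.normal(scale=0.1, size=100)
    reg = LinearRegression().fit(X_reg, y_reg)
    print("predicted value:", reg.predict([[0.5]]))

    # Clustering (unsupervised): group unlabeled data into reasonable subsets.
    X_clu = np.vstack([rng.normal(-2.0, 0.5, size=(50, 2)),
                       rng.normal(+2.0, 0.5, size=(50, 2))])
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(X_clu)
    print("cluster sizes:", np.bincount(labels))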

This course is not offered in summer 2023. It will be offered again in summer 2024.

Lecture:
(2 SWS)
Wednesday, 14:15-15:45
101-01-009/013

Exercises:
(2 SWS)
Monday, 14:30-15:30
Hybrid meeting.
Physical: Room 101-00-010/014
Online: Zoom (Password: see Discussion forum)
Contact persons: Simon Ging, Philipp Schröppel
Discussion Forum

Beginning: Lecture: Wednesday, April 27, 2022
Exercises: Monday, May 2, 2022

ECTS Credits: 6

Recommended semester: 1 or 2 (MSc)
Requirements: Fundamental mathematical knowledge, particularly statistics.

Exam: The written exam takes place on 12.9.2023, 13:00-14:00. It will consist of a mixture of binary choice questions and fields in which you must fill in your solution. To get an idea of the style of the exam, you can have a look at the test exam for image processing and the test exam for optimization.

Remarks: Full completion of all relevant theoretical and programming assignments is highly recommended.

Slides and Recordings

Date | Topic | Slides | Recordings | Solutions
27.4. | Class 1: Introduction | MachineLearning01.pdf | MachineLearning01.mp4
4.5. | Class 2: Probability distributions | MachineLearning02.pdf | MachineLearning02.mp4
11.5. | Class 3: Mixture models, clustering, and EM | MachineLearning03.pdf | MachineLearning03.mp4
18.5. | Class 4: Nonparametric methods | MachineLearning04.pdf | MachineLearning04.mp4
25.5. | Class 5: Regression | MachineLearning05.pdf | MachineLearning05.mp4
1.6. | Class 6: Gaussian processes | MachineLearning06.pdf | MachineLearning06.mp4
15.6. | Class 7: Classification | MachineLearning07.pdf | MachineLearning07.mp4
22.6. | Class 8: Support vector machines | MachineLearning08.pdf | MachineLearning08.mp4
29.6. | Class 9: Projection methods | MachineLearning09.pdf | MachineLearning09.mp4
6.7. | Class 10: Inference in graphical models | MachineLearning10.pdf | MachineLearning10.mp4
13.7. | Class 11: Sampling methods | MachineLearning11.pdf | MachineLearning11.mp4
27.7. | Class 12: Q&A

Exercises

The exercise material is provided in a GitHub repository.

There is an Online Forum for announcements, questions, and discussions.