
Statistical Pattern Recognition

Prof. Thomas Brox

Statistical pattern recognition, nowadays often known under the term "machine learning", is a key element of modern computer science. Its goal is to find, learn, and recognize patterns in complex data, for example in images, speech, biological pathways, or the internet. In all these cases we would like to relate abstract symbols, such as the word "dog", to patterns that cannot be described by a simple equation or a set of logical statements. Can you describe the symbol "dog" using just the pixel values of an image? How do you deal with the variation among dog images? Can you recognize an unknown dog as a dog? What is the pattern common to all dogs? Statistical pattern recognition tries to answer these questions by looking at many examples; the more, the better. This is very different from classical computer science, where the computer program, the algorithm, is the key element of the process. In machine learning we still have a learning algorithm, but the actual information lies not in the algorithm but in the data processed and stored by it.

This course gives an introduction to the classical tasks of pattern recognition: classification, regression, and clustering. In classification, we learn a decision function from annotated training examples (e.g., a set of dog images); given a new image, the classifier should be able to tell whether or not it shows a dog. In regression, we learn a mapping from input variables to output variables, again from a set of input/output pairs. Both classification and regression are supervised methods, as the training data comes together with the correct output. Clustering is an unsupervised learning method: we are given only unlabeled data, and clustering should separate it into reasonable subsets. The course is based in large part on the textbook "Pattern Recognition and Machine Learning" by Christopher Bishop. The exercises consist of theoretical assignments and programming assignments in Matlab.
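To give a flavour of the programming assignments, here is a minimal sketch of a 1-nearest-neighbour classifier in Matlab. The toy data and variable names are illustrative only and are not taken from the actual exercises.

    % Toy training set: two 2-D Gaussian point clouds with labels 1 and 2.
    % All data and names here are illustrative; the real assignments differ.
    rng(0);
    X = [randn(50,2); randn(50,2) + 3];   % 100 training points, one per row
    y = [ones(50,1); 2*ones(50,1)];       % class label for each training point

    % 1-nearest-neighbour rule: a new point gets the label of the
    % closest training point (squared Euclidean distance).
    xnew = [2.5 2.5];
    d = sum(bsxfun(@minus, X, xnew).^2, 2);
    [~, idx] = min(d);
    fprintf('Predicted class: %d\n', y(idx));

Even this simple rule illustrates the point made above: the decision is determined by the stored training data rather than by a hand-written rule.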

The content of this course is complementary to the Machine Learning course offered by Joschka Boedecker and Frank Hutter. If you want to specialize in machine learning, it makes sense to attend both courses.

Lecture:
(2 SWS)
Thursday, 10am-12pm (10-12 Uhr)
Building 101, Room SR 01-016

Exercises:
(2 SWS)
Monday, 2pm-4pm (14-16 Uhr)
Building 082 (Mensa), Pool 00-029
Contact persons: Huizhong Zhou, Anton Böhm
Discussion forum: DiscussionForum

Beginning:
Lecture: Thursday, April 27, 2017
Exercises: Monday, May 8, 2017

ECTS Credits: 6

Recommended semester:

6 (BSc)
1 or 2 (MSc)
Requirements: Fundamental mathematical knowledge, particularly statistics.

Remarks: Full completion of all relevant theoretical and programming assignments is highly recommended.
There will be a written exam after the lecturing period.
Participants of the exam in March 2017 can have a look at their exam on 5.5.2017 15:50 in room 52-1-33.

Slides and Recordings

Date  | Topic                                        | Slides                | Recordings            | Material
27.4. | Class 1: Introduction                        | MachineLearning01.pdf | MachineLearning01.avi | Exercise 1
4.5.  | Class 2: Probability distributions           | MachineLearning02.pdf | MachineLearning02.avi | Exercise 2
11.5. | Class 3: Mixture models, clustering, and EM  | MachineLearning03.pdf | MachineLearning03.mp4 | Exercise 3
18.5. | Class 4: Nonparametric methods               | MachineLearning04.pdf | MachineLearning04.avi | Exercise 4
1.6.  | Class 5: Regression                          | MachineLearning05.pdf | MachineLearning05.avi | Exercise 5
22.6. | Class 6: Gaussian processes                  | MachineLearning06.pdf | MachineLearning06.avi | Exercise 6
29.6. | Class 7: Classification                      | MachineLearning07.pdf | MachineLearning07.avi | Exercise 7
6.7.  | Class 8: Support vector machines             | MachineLearning08.pdf | MachineLearning08.mp4 | Exercise 8
13.7. | Class 9: Projection methods                  | MachineLearning10.pdf | MachineLearning10.mp4 | Exercise 9
20.7. | Class 10: Inference in graphical models      | MachineLearning11.pdf | MachineLearning11.avi | Exercise 10
27.7. | Class 11: Sampling methods                   | MachineLearning12.pdf | MachineLearning12.avi |

Additional information and some helpful hints regarding the exercises can be found here.

Recordings from Olaf Ronneberger (2015):

Class 1: Introduction
Class 2: Probability distributions
Class 3: Mixture models, clustering, and EM
Class 4: Nonparametric methods
Class 5: Regression
Class 7: Classification
Class 8: Support vector machines
Class 9: Deep Learning
Class 10: Projection methods
Class 12: Sampling methods