Statistical Pattern Recognition

Prof. Thomas Brox

Statistical pattern recognition, nowadays often known under the term "machine learning", is a key element of modern computer science. Its goal is to find, learn, and recognize patterns in complex data, for example in images, speech, biological pathways, or the internet. In all these cases we would like to relate abstract symbols, such as the word "dog", to patterns that cannot be described by a simple equation or a set of logical statements. Can you describe the symbol "dog" using just the pixel values of an image? How do you deal with the variation among dog images? Can you recognize an unknown dog as a dog? What is the pattern that is common to all dogs? Statistical pattern recognition tries to answer these questions by looking at many examples, the more the better. This is very different from classical computer science, where the computer program, the algorithm, is the key element of the process. In machine learning we still have a learning algorithm, but the actual information is not in the algorithm but in the data processed and stored by this algorithm.

This course gives an introduction to the classical tasks of pattern recognition: classification, regression, and clustering. In the case of classification, we learn a decision function from annotated training examples (e.g., a set of dog images). Given a new image, the classifier should be able to tell whether it is a dog image or not. In regression we learn a mapping from input variables to continuous output values; again, this mapping is learned from a set of input/output pairs. Both classification and regression are supervised methods, as the training data comes together with the correct output. Clustering is an unsupervised learning method: we are given only unlabeled data, and clustering should separate it into reasonable subsets. The three tasks are sketched in the short example below. The course is based in large parts on the textbook "Pattern Recognition and Machine Learning" by Christopher Bishop. The exercises consist of theoretical assignments and programming assignments in Matlab.
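To make the three tasks concrete, here is a minimal Matlab sketch on toy data (Matlab being the language of the exercises). The data, the variable names, and the simple methods chosen here (nearest-neighbour classification, least-squares regression, k-means clustering) are illustrative assumptions, not part of the course material:

% Classification: 1-nearest-neighbour on two Gaussian blobs
Xtrain = [randn(50,2); bsxfun(@plus, randn(50,2), [3 3])];  % 100 labeled training points
ytrain = [ones(50,1); 2*ones(50,1)];                        % class labels 1 and 2
xnew = [1.5 1.5];                                           % unseen query point
[~, nn] = min(sum(bsxfun(@minus, Xtrain, xnew).^2, 2));     % index of closest training point
yhat = ytrain(nn);                                          % predicted class

% Regression: least-squares fit of a line t = w(1) + w(2)*x
x = linspace(0, 1, 20)';
t = 0.5 + 2*x + 0.1*randn(20,1);        % noisy input/output pairs
w = [ones(20,1) x] \ t;                 % w(1) = intercept, w(2) = slope

% Clustering: k-means with k = 2 on the same data, labels withheld
X = Xtrain;                             % pretend the labels are unknown
mu = X(randperm(100, 2), :);            % two random points as initial centroids
for it = 1:10
    d1 = sum(bsxfun(@minus, X, mu(1,:)).^2, 2);
    d2 = sum(bsxfun(@minus, X, mu(2,:)).^2, 2);
    z = 1 + (d2 < d1);                  % assign each point to its nearest centroid
    mu(1,:) = mean(X(z == 1, :), 1);    % re-estimate the centroids
    mu(2,:) = mean(X(z == 2, :), 1);
end

In the actual assignments these one-liners are replaced by the probabilistic models from Bishop's book, but the supervised/unsupervised distinction is exactly the one visible here: the first two snippets use labels or target values, the last one does not.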

The content of this course is complementary to the Machine Learning course offered by Joschka Boedecker and Frank Hutter. It absolutely makes sense to attend both courses if you want to specialize in Machine Learning.

Lecture:
(2 SWS)
Thursday, 10am-12pm (10-12 Uhr)
Building 101, Room SR 01-018

Exercises:
(2 SWS)
Monday, 4pm-6pm (16-18 Uhr)
Building 082 (Mensa), Pool 00-028
Contact persons: Eddy Ilg, MohammadReza Zolfaghari
Discussion forum

Beginning: Lecture: Thursday, April 21, 2016
Exercises: Monday, April 25, 2016

ECTS Credits: 6

Recommended semester:
6 (BSc)
1 or 2 (MSc)
Requirements: Fundamental mathematical knowledge, particularly statistics.

Remarks: Completing all relevant theoretical and programming assignments is highly recommended.
Written exam on 7.3.2017, 10:00-11:30, in room 101-026. The exam will consist of a mixture of binary-choice questions and fields in which you fill in your solution. To get an idea of the style of the exam, you can have a look at the test exam for image processing and the test exam for optimization.
You can have a look at your exam from March on 5.5.2017 at 15:50 in room 52-1-33.

Slides and Recordings

Date  | Topic                                       | Slides                | Recordings            | Material
21.4. | Class 1: Introduction                       | MachineLearning01.pdf | MachineLearning01.avi | Exercise 1
28.4. | Class 2: Probability distributions          | MachineLearning02.pdf | MachineLearning02.avi | Exercise 2
12.5. | Class 3: Mixture models, clustering, and EM | MachineLearning03.pdf | MachineLearning03.mp4 | Exercise 3
2.6.  | Class 4: Nonparametric methods              | MachineLearning04.pdf | MachineLearning04.avi | Exercise 4
9.6.  | Class 5: Regression                         | MachineLearning05.pdf | MachineLearning05.avi | Exercise 5
16.6. | Class 6: Gaussian processes                 | MachineLearning06.pdf | MachineLearning06.avi | Exercise 6
23.6. | Class 7: Classification                     | MachineLearning07.pdf | MachineLearning07.avi | Exercise 7
30.6. | Class 8: Support vector machines            | MachineLearning08.pdf | MachineLearning08.mp4 | Exercise 8
7.7.  | Class 9: Projection methods                 | MachineLearning10.pdf | MachineLearning10.mp4 | Exercise 9
14.7. | Class 10: Inference in graphical models     | MachineLearning11.pdf | MachineLearning11.avi | Exercise 10
21.7. | Class 11: Sampling methods                  | MachineLearning12.pdf | MachineLearning12.avi |

Additional information and some helpful hints regarding the exercises can be found here.

Recordings from Olaf Ronneberger (2015):

Class 1: Introduction
Class 2: Probability distributions
Class 3: Mixture models, clustering, and EM
Class 4: Nonparametric methods
Class 5: Regression
Class 7: Classification
Class 8: Support vector machines
Class 9: Relevance vector machines
Class 10: Projection methods
Class 12: Sampling methods