Introduction to Pattern Recognition (code: NB054B)
for students of AI and computer science (and others who are interested)
In the field of (statistical) pattern recognition, the aim is
to teach a computer (by examples) to recognize
patterns in data sets (e.g. input-output relations).
Real data is often noisy, and therefore
probabilistic methods are used. In this course we take
the Bayesian perspective, which will be the starting point for
a treatment of both classical methods (least mean squares methods, discriminant
analysis) and modern methods (neural networks, Bayesian learning).
This course aims at a principled treatment of pattern recognition. For
a good understanding of pattern recognition (as well as of many other
subjects in modern AI), a certain mathematical depth is necessary. In
this course, we will not avoid the mathematics. However, ample time
will be reserved to acquire the necessary mathematical knowledge and
skills.
Course information
- Semester:
the course is held every spring. In 2011: 2nd semester, starting 3 February.
Enrollment via KISS, Blackboard, etc. should work now.
If not, please contact W. Wiegerinck by email if you want
to follow the course.
- Time: Lectures (hoorcollege): Thursday 8:45-10:30.
Practical (werkcollege): Thursday 10:45-12:30.
About 14 sessions.
- Place: Erasmus Building
(see Blackboard).
Practical: same place.
- Lecturers:
Wim Wiegerinck,
Bert Kappen, and
Tom Heskes
- Practicum assistant:
Tom Claassen
- Credits:
- Students of AI and computer science: 6 ECTS
- Students of physics and mathematics:
We no longer offer the physics version of this course (NB054C).
Physics and mathematics students interested in machine learning or neural networks are advised to follow the course
"Neural Networks and Information Theory" in the bachelor's programme and the course "Machine Learning" in the master's programme (contact Bert Kappen for more information).
- Others:
please contact one of the teachers (Bert Kappen, Wim Wiegerinck, Tom Heskes) if you are interested in following this course.
- Aim: After the course, students
should be able to understand and apply
existing models and (learning) algorithms for
statistical pattern recognition, such as Gaussian models, mixture models, EM,
neural networks and the well-known backpropagation
algorithm, and to motivate, formulate and derive models of their own.
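To give a flavour of what "understand and apply" means here, a minimal sketch of maximum-likelihood estimation for a one-dimensional Gaussian (one of the models listed above). It is written in Python rather than Matlab, and the data values are made up purely for illustration:

```python
import math

# Hypothetical toy sample (made-up numbers, for illustration only).
data = [2.1, 1.9, 2.4, 2.0, 1.6]
n = len(data)

# Maximum-likelihood estimates for a 1-D Gaussian N(mu, sigma^2):
mu = sum(data) / n                           # sample mean
var = sum((x - mu) ** 2 for x in data) / n   # ML variance (divides by n, not n - 1)

def gaussian_pdf(x, mu, var):
    """Density of the fitted Gaussian at point x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
```

Note that the ML variance estimate divides by n; the course discusses why this differs from the unbiased estimator that divides by n - 1.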
- Intended audience
This course is intended for
- Master students in computer science (informatica)
(theme of Artificial Intelligence).
- Master students in artificial intelligence/cognitive science
- others with an interest in the subject
and sufficient background (see below).
- Prerequisites
- Mathematics: basic understanding of differential and integral calculus,
linear algebra and probability theory, at least at the level
of the course description of
Mathematics (I and II)
for AI (but we will explain and recapitulate where necessary).
- Programming language: there will be some computer exercises in which you have to do some programming. You can choose any language you like. If you don't have programming experience, we would advise Matlab, as it is very easy to learn.
- Course material:
Pattern Recognition and Machine Learning,
Christopher M. Bishop,
Springer (2006)
- Course overview:
- Motivation and overview of statistical pattern recognition
- Bayes' rule for reasoning under uncertainty
- Probability density estimation, maximum likelihood and Bayesian inference
- Regression, classification
- Non-linear models, such as the multi-layer perceptron
We will discuss subjects such as:
- Error functions and their relation with probabilistic modelling
- Learning, generalization, and overfitting
- Bayesian vs. frequentist methods
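As a small illustration of the role Bayes' rule plays in these subjects, here is its application to a two-class classification problem. All probability values below are invented purely for illustration:

```python
# Bayes' rule: P(C1 | x) = P(x | C1) P(C1) / P(x),
# where P(x) = P(x | C1) P(C1) + P(x | C2) P(C2).
# All numbers are hypothetical example values.
prior_c1, prior_c2 = 0.3, 0.7    # prior class probabilities P(Ck)
lik_c1, lik_c2 = 0.8, 0.2        # class-conditional values P(x | Ck) at the observed x

evidence = lik_c1 * prior_c1 + lik_c2 * prior_c2   # P(x)
posterior_c1 = lik_c1 * prior_c1 / evidence        # P(C1 | x)
posterior_c2 = lik_c2 * prior_c2 / evidence        # P(C2 | x)
```

Even though class C1 has the smaller prior, the larger likelihood tips the posterior in its favour; the posteriors sum to one by construction.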
In addition, we will treat the following mathematical subjects in detail:
- Elementary probability theory
- Linear algebra: symmetric matrices, eigenvectors and eigenvalues,
coordinate transformations.
- Functions of several variables:
differentiation, integration, and maximization
- Gaussian integral, computation of mean and variance of a Gaussian
- Taylor series: zeroth order, first order and second order approximations
- Minimization under constraints using Lagrange multipliers
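For the Gaussian-integral item above, the central (standard) result is the normalization of the Gaussian density, stated here for reference:

```latex
\int_{-\infty}^{\infty} e^{-x^2/2}\, dx = \sqrt{2\pi},
\qquad\text{so that}\qquad
\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}}
\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx = 1,
\quad\text{with}\quad
\mathbb{E}[x] = \mu, \qquad \operatorname{var}[x] = \sigma^2 .
```

The course derives these results in detail; the mean and variance follow from the integral by differentiation under the integral sign or by a change of variables.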
Course scheme
We start on February 3. Other information will be provided on Blackboard or will be discussed during the course.
Additional course material (2010)
Sheets
(work in progress - sheets may change during the course):
[pdf]
Web solutions
Solutions to all www exercises from the first four book chapters,
available from the author's home page: [pdf]
Handouts:
Integration with several variables: [ps],
[pdf]
Other material will be provided on Blackboard.