List of material and reading schedule (updated weekly)
1 Introduction
1.1 Polynomial curve fitting
1.2 Probability theory
1.3 Model selection
1.4 Curse of dimensionality
1.5 Decision theory (not 1.5.3-1.5.5)
1.6 Information theory
Appendix E Lagrange multipliers
2 Probability distributions
2.1 Binary variables
2.2 Multinomial variables
2.3 Gaussian distribution + appendix C
(replace 2.3.3 with slides 143-145)
(not 2.3.4-2.3.5)
2.3.6 up to eq. 2.143
(not 2.3.7-2.3.9)
2.4 Exponential family (not covered)
3 Linear models for regression
3.1 Linear basis function models (not 3.1.3, 3.1.5)
3.2 Bias-variance decomposition
3.3 Bayesian linear regression (not 3.3.3)
3.4 Bayesian model comparison
3.5 Evidence approximation (not 3.5.2, 3.5.3)
3.6 Limitations of fixed basis functions (not covered)
4 Linear models for classification
4.1.7 Perceptrons
Proof of the perceptron learning rule (perceptrons handout)
4.3.1-2 Probabilistic discriminative models
6 Kernel methods
6.4.2 Gaussian processes
6.4.4 Automatic relevance determination (not exam material)
Application of GP with ARD on genetic data, slides 234-241 (not exam material)
Illustration of deep learning, slides 267-277 (not exam material)
Lecture 1: Bishop 1.1-1.2.0
Lecture 2: Bishop 1.2.1-1.2.6
Lecture 3: Bishop 1.3-1.6
Lecture 4: Bishop chapter 2
Lecture 5: Bishop 3.1-3.3 (bias-variance decomposition and Bayesian linear regression)
Lecture 6: Bishop 3.4-3.5 (model comparison, evidence framework), 4.1.7 (perceptron)
Lecture 7: Bishop 4.3.1-3 (logistic regression), 6.4.2-4 (Gaussian processes)