XI. Expectation maximisation algorithm
Given by Sven Laur
Brief summary: Informal derivation of the EM algorithm and its convergence proof. Kullback-Leibler divergence. Bayesian and frequentist interpretations. Robust Gaussian mixtures and their application in geodata analysis. Reconstruction of missing values with the EM algorithm. The data augmentation algorithm as a generalisation of the EM algorithm. Imputation of missing data with the data augmentation algorithm.
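As a companion to the summary above, the alternation of E- and M-steps can be sketched for a two-component one-dimensional Gaussian mixture. This is a minimal illustrative sketch in NumPy, not the lecture's derivation: the data, initial values, and component count are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data drawn from two Gaussians (illustrative only).
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

# Initial guesses for mixture weights, means, and variances.
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def normal_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    p = w * normal_pdf(x[:, None], mu, var)        # shape (n, 2)
    r = p / p.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the soft assignments.
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k

print(np.sort(mu))  # the estimated means approach the true centres -2 and 3
```

Each iteration cannot decrease the data log-likelihood, which is the essence of the convergence proof discussed in the lecture.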
Slides: PDF
Additional materials: Cookbook for EM-algorithms
Video: UTTV (2016), UTTV (2015)
Literature:
- Bishop: Pattern Recognition and Machine Learning, pages 450–455
- Tanner: Tools for Statistical Inference: Methods for the Exploration of Posterior Distributions and Likelihood Functions, pages 90–100
Complementary exercises:
- Bishop: Pattern Recognition and Machine Learning, pages 455–459
- Use the EM and DA algorithms for imputation of missing data and compare the results
- Use the EM and DA algorithms for clustering of events or locations:
  - Try to fit a Gaussian mixture model to diamond resources.
  - Try to fit a Gaussian mixture model to petrol resources.
  - Try to use more complex models to track the spread of swine flu.
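For the imputation exercise above, a simplified EM-style scheme can be sketched for bivariate Gaussian data with missing values in one coordinate. This sketch alternates between filling missing entries with their conditional means under the current Gaussian fit and re-estimating the mean and covariance; the data and missingness pattern are invented for illustration, and a full EM would also add the conditional variance of the imputed coordinate to the covariance estimate rather than using point imputation alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated bivariate Gaussian data (illustrative only).
mean_true = np.array([0.0, 5.0])
cov_true = np.array([[1.0, 0.8], [0.8, 1.0]])
X = rng.multivariate_normal(mean_true, cov_true, size=500)

# Knock out 20% of the second coordinate at random.
miss = rng.random(500) < 0.2
X_obs = X.copy()
X_obs[miss, 1] = np.nan

X_hat = X_obs.copy()
X_hat[miss, 1] = np.nanmean(X_obs[:, 1])  # crude initial fill
for _ in range(30):
    # M-step: re-estimate mean and covariance from the completed data.
    mu = X_hat.mean(axis=0)
    S = np.cov(X_hat, rowvar=False)
    # E-step (point version): conditional mean of x2 given x1
    # under the current Gaussian fit.
    X_hat[miss, 1] = mu[1] + S[0, 1] / S[0, 0] * (X_hat[miss, 0] - mu[0])

print(np.round(mu, 1))  # recovered mean should be close to (0, 5)
```

The data augmentation algorithm replaces the deterministic fill with draws from the conditional distribution, which is what the comparison in the exercise is meant to highlight.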
Free implementations: