VI. Neural networks
Given by Ilya Kuzovkin
Brief summary: Neural networks as a toolbox for approximating complex functions. Generalised linear models and the conceptual design of a feed-forward network. The hidden layer as an adaptive, non-linear map into a higher-dimensional feature space. Sigmoid functions and radial basis functions as standard ways to build the non-linear mapping. The backpropagation algorithm as an efficient gradient descent procedure. Higher-order methods for minimising the training error. Computer vision and invariance under shifts and rotations. Training methods for enforcing this type of invariance.
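As a minimal illustration of these ideas (not part of the course materials), the sketch below trains a one-hidden-layer network with sigmoid units on XOR in plain NumPy; the backward pass implements the backpropagation chain-rule equations covered in the Bishop chapter listed under Literature. The hidden-layer size, learning rate, and epoch count are arbitrary illustration choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy data: XOR, a function no generalised linear model can represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer: an adaptive non-linear map to a higher-dimensional space.
n_hidden = 8
W1 = rng.normal(scale=1.0, size=(2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=1.0, size=(n_hidden, 1))
b2 = np.zeros(1)

lr = 1.0
for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network output

    # Backward pass: gradients of the squared error via the chain rule.
    err = out - y
    delta2 = err * out * (1 - out)             # output-layer deltas
    delta1 = (delta2 @ W2.T) * h * (1 - h)     # hidden-layer deltas

    # Plain gradient descent step.
    W2 -= lr * h.T @ delta2
    b2 -= lr * delta2.sum(axis=0)
    W1 -= lr * X.T @ delta1
    b1 -= lr * delta1.sum(axis=0)

print(out.round(3))  # typically converges close to [0, 1, 1, 0]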
Slides: PDF & additional set of slides with backpropagation
Video: Panopto (2016) UTTV (2015) UTTV (2014)
Literature:
- Bishop: Pattern Recognition and Machine Learning, pages 225–272
Complementary exercises:
- Bishop: Pattern Recognition and Machine Learning, pages 284–290
- Use neural networks for classification and prediction on the various datasets listed below and compare the results with those obtained in the earlier exercise sessions
- Build a translation-invariant neural network for distinguishing digits in the Semeion Handwritten Digit Data Set:
  - First, use small random translations to enlarge the data set (a sketch of this step follows the list).
  - Second, use the tangent propagation method.
- Try both two-class and multi-class classification tasks.
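One possible sketch of the augmentation step, assuming the UCI `semeion.data` file format; the filename, the ±2-pixel shift range, and the number of copies are assumptions for illustration, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_translate(img, max_shift=2):
    """Shift a digit image by a random offset of up to max_shift pixels,
    filling the vacated border with zeros (background)."""
    h, w = img.shape
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
    out = np.zeros_like(img)
    src = img[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    out[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)] = src
    return out

# Assumed UCI format: each row holds 256 binary pixel values
# (a 16x16 image) followed by a 10-dimensional one-hot label.
data = np.loadtxt("semeion.data")
X, Y = data[:, :256], data[:, 256:]
images = X.reshape(-1, 16, 16)

# Keep each original digit plus n_copies randomly translated versions.
n_copies = 4
X_aug = np.concatenate(
    [X] + [np.stack([random_translate(im).ravel() for im in images])
           for _ in range(n_copies)])
Y_aug = np.concatenate([Y] * (n_copies + 1))
```

The second step (tangent propagation) replaces this explicit augmentation with a regularisation term that penalises the derivative of the network output along the translation tangent vectors; see the regularisation material in the Bishop chapter listed above.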
Free implementations: