
COMP60431: Machine Learning (2007-2008)

This is an archived syllabus from 2007-2008

Machine Learning
Level: 6
Credit rating: 15
Pre-requisites: No Pre-requisites
Co-requisites: No Co-requisites
Lectures: 1 day per week (5 weeks)
Lecturers: Gavin Brown, Magnus Rattray
Timetable
Semester | Event | Location | Day | Time | Group
Sem 1 w1-5 | Lecture | 2.15 | Fri | 09:00 - 17:00 | -
Assessment Breakdown
Exam: 33%
Coursework: 67%
Lab: 0%

Introduction

Machine Learning is concerned with how to automate learning from experience. This is typically accomplished by forming 'models' which to some extent describe or summarise the experiences embodied in our data in a useful way. For example, speech recognition software requires examples of continuous speech and will often form a different model for each user. In this course a variety of machine learning paradigms and algorithms will be introduced which are appropriate for learning from examples with discrete or continuous-valued attributes. The course has a fairly mathematical content, although it is intended to be self-contained.

Aims

This course unit aims to introduce the main algorithms used in modern machine learning, to introduce the theoretical foundations of machine learning and to provide practical experience of applying machine learning techniques.

Learning Outcomes

A student completing this course unit should:
have knowledge and understanding of the principal algorithms used in modern machine learning, as outlined in the syllabus below (A)
have sufficient knowledge of information theory and probability theory to understand some basic theoretical results in machine learning (A)
be able to apply machine learning algorithms to real datasets, evaluate their performance and appreciate the practical issues involved (C)
be able to provide a clear and concise description and justification for the employed experimental procedures (D)

Assessment of Learning outcomes

Learning outcomes (1) and (2) are assessed by examination; learning outcomes (1), (3) and (4) are assessed by laboratory reports.

Contribution to Programme Learning Outcomes

A1, A2, C1, D3, D4

Syllabus

Introduction to machine learning: Overview of different task types: classification, regression, clustering, control.

Supervised Machine Learning

Linear regression/classification - perceptrons, discriminant functions
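By way of illustration only (this sketch is not part of the course materials), the classic perceptron learning rule for a linearly separable problem can be written in a few lines of Python; the dataset and learning rate below are made up for the example:

```python
def perceptron_step(w, b, x, y, lr=1.0):
    """One perceptron update: nudge the weights only when x is
    misclassified. Labels y are in {-1, +1}."""
    activation = sum(wi * xi for wi, xi in zip(w, x)) + b
    if y * activation <= 0:  # misclassified (or exactly on the boundary)
        w = [wi + lr * y * xi for wi, xi in zip(w, x)]
        b = b + lr * y
    return w, b

# Linearly separable AND-style data: only (1, 1) is positive.
data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]
w, b = [0.0, 0.0], 0.0
for _ in range(20):  # the convergence theorem guarantees this terminates
    for x, y in data:
        w, b = perceptron_step(w, b, x, y)

# Check every training point is now classified with the correct sign.
print(all(y * (sum(wi * xi for wi, xi in zip(w, x)) + b) > 0
          for x, y in data))  # → True
```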

Feature selection, entropy, mutual information/gain - decision trees, ID3 and extensions.
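As an illustrative sketch (the dataset is invented for the example), the entropy and information-gain quantities used by ID3 to score candidate splits might be computed as:

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_gain(labels, split):
    """Reduction in entropy from partitioning `labels` by the
    boolean attribute `split` (one True/False value per example)."""
    left = [y for y, s in zip(labels, split) if s]
    right = [y for y, s in zip(labels, split) if not s]
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) \
              + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

# Toy example: a perfectly informative attribute removes all uncertainty.
labels = ['+', '+', '-', '-']
split  = [True, True, False, False]
print(information_gain(labels, split))  # → 1.0
```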

Performance assessment - overfitting, generalisation, ROC analysis, comparing two algorithms (read Dietterich's paper)

Non-linear regression/classification - Feed-forward neural networks, support vector machines.

Model complexity - regularisation, bias+variance

Combining models - ensemble learning, mixtures of experts, boosting, bias+variance+covariance

Project

Write a research paper applying appropriate techniques to supplied datasets.

Unsupervised Machine Learning


Introduction to probabilistic modelling: Bayes' rule, maximum likelihood, Bayesian inference, latent variable models
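To illustrate Bayes' rule numerically (the probabilities below are invented for the example), posterior probabilities follow directly from the prior and the likelihoods:

```python
# Bayes' rule: P(disease | positive)
#   = P(positive | disease) P(disease) / P(positive).
p_disease = 0.01          # prior probability of the disease
p_pos_given_d = 0.95      # test sensitivity
p_pos_given_not_d = 0.05  # false-positive rate

# Marginal likelihood of a positive test, summing over both hypotheses.
p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)
posterior = p_pos_given_d * p_disease / p_pos
print(round(posterior, 3))  # → 0.161
```

Note that despite the accurate test, the posterior stays low because the prior is small; this is exactly the kind of reasoning Bayesian inference makes explicit.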

Clustering: Gaussian mixtures, EM-algorithm, k-means
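A minimal sketch of the k-means idea on one-dimensional data (the dataset is invented for the example; real implementations handle vectors and convergence tests):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on 1-D data: alternate an assignment step and a
    mean-update step for a fixed number of iterations."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre.
        clusters = [[] for _ in range(k)]
        for x in points:
            nearest = min(range(k), key=lambda j: abs(x - centres[j]))
            clusters[nearest].append(x)
        # Update step: each centre moves to the mean of its cluster
        # (an empty cluster keeps its previous centre).
        centres = [sum(c) / len(c) if c else centres[j]
                   for j, c in enumerate(clusters)]
    return sorted(centres)

# Two well-separated groups; the centres settle near 1.0 and 10.0.
data = [0.9, 1.0, 1.1, 9.9, 10.0, 10.1]
print(kmeans(data, 2))
```

The EM algorithm for Gaussian mixtures generalises this: hard assignments become posterior responsibilities, and the update step re-estimates means, covariances and mixing weights.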

Dimensionality reduction and visualisation: PCA and non-linear extensions
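For two-dimensional data the first principal component can be found in closed form from the 2x2 covariance matrix; the points below are invented for this illustrative sketch:

```python
import math

def pca_first_component(points):
    """First principal component of 2-D data, via the closed-form
    eigendecomposition of the 2x2 covariance matrix [[a, b], [b, c]]."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    a = sum((x - mx) ** 2 for x, _ in points) / n
    c = sum((y - my) ** 2 for _, y in points) / n
    b = sum((x - mx) * (y - my) for x, y in points) / n
    # Largest eigenvalue of a symmetric 2x2 matrix.
    lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    # Corresponding eigenvector, normalised to unit length.
    if abs(b) > 1e-12:
        vx, vy = b, lam - a
    else:  # already axis-aligned: pick the axis with larger variance
        vx, vy = (1.0, 0.0) if a >= c else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

# Points spread roughly along y = x: the component is close to
# (0.707, 0.707), the direction of greatest variance.
pts = [(0, 0), (1, 1), (2, 2), (3, 3.1)]
print(pca_first_component(pts))
```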

Sequence learning: Markov chains, hidden Markov models and linear dynamical systems
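As an illustrative sketch of a first-order Markov chain (the two-state weather model and its transition probabilities are invented for the example):

```python
import random

def simulate_markov_chain(transitions, start, steps, seed=0):
    """Sample a state sequence from a first-order Markov chain.
    `transitions[s]` maps each possible next state to its probability."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        # The next state depends only on the current one (Markov property).
        states, probs = zip(*transitions[state].items())
        state = rng.choices(states, weights=probs)[0]
        path.append(state)
    return path

transitions = {
    'sunny': {'sunny': 0.8, 'rainy': 0.2},
    'rainy': {'sunny': 0.4, 'rainy': 0.6},
}
print(simulate_markov_chain(transitions, 'sunny', 5))
```

A hidden Markov model adds one layer to this picture: the state sequence itself is unobserved, and each state instead emits an observable symbol from a state-dependent distribution.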

Reading List

Core Text
Title: Pattern recognition and machine learning
Author: Bishop, Christopher M.
ISBN: 9780387310732
Publisher: Springer
Edition:
Year: 2006


Supplementary Text
Title: Bioinformatics: the machine learning approach (2nd edition)
Author: Baldi, Pierre and Søren Brunak
ISBN: 026202506X
Publisher: MIT Press
Edition: 2nd
Year: 2001
This text covers a number of machine learning applications in biology and provides a good introduction to hidden Markov models, neural network learning algorithms and Bayesian inference.


Supplementary Text
Title: Machine learning
Author: Mitchell, Tom M.
ISBN: 0070428077
Publisher: McGraw-Hill
Edition:
Year: 1997


Supplementary Text
Title: Introduction to natural computation
Author: Ballard, Dana H.
ISBN: 0262522586
Publisher: MIT Press
Edition:
Year: 1999
Provides a different perspective, with emphasis on the computational aspects of learning algorithms in relation to computational models of the brain. Also covers some material on control and hidden Markov models not discussed in Mitchell's book.


Supplementary Text
Title: Elements of statistical learning: data mining, inference and prediction
Author: Hastie, Trevor, Robert Tibshirani and Jerome Friedman
ISBN: 9780387952840
Publisher: Springer
Edition:
Year: 2001
An advanced textbook taking a statistical perspective.


Supplementary Text
Title: Neural networks for pattern recognition
Author: Bishop, Christopher M.
ISBN: 0198537642
Publisher: Clarendon Press
Edition:
Year: 1995
This is Bishop's earlier book and provides a good introduction to neural networks and related statistical methods.