
COMP20411: Machine Learning (2009-2010)

This is an archived syllabus from 2009-2010

Machine Learning
Level: 2
Credit rating: 10
Pre-requisites: COMP10020 or equivalent, e.g. (MATH10662 and MATH10672) or (MATH10111 and MATH10131 and MATH10212)
Co-requisites: COMP10412
Duration: 11 weeks in first semester
Lectures: 22 in total, 2 per week
Labs: 10 hours in total, 5 2-hour sessions, partly credited to COMP20910/COMP20920
Lecturers: Gavin Brown, Ke Chen
Timetable
Sem 1 w1-5,7-12      Lecture  1.1   Tue 13:00-15:00  -
Sem 1 w1,3,5,8,10,12 Lab      G23   Wed 09:00-11:00  Group H
Sem 1 w1,3,5,8,10,12 Lab      G23   Thu 09:00-11:00  Group F
Sem 1 w1,3,5,8,10,12 Lab      UNIX  Fri 09:00-11:00  Group G
Sem 1 w1,3,5,8,10,12 Lab      G23   Fri 11:00-13:00  Group I
Assessment Breakdown
Exam: 60%
Coursework: 0%
Lab: 40%
Degrees for which this unit is optional
  • Artificial Intelligence BSc (Hons)


Aims

To introduce methods for extracting rules or learning from data, and to provide the mathematical background needed to understand how the methods work and how to get the best performance from them. The course covers the basics of both the supervised and unsupervised learning paradigms, and is pitched at any student with a mathematical or scientific background who is interested in adaptive techniques for learning from data, as well as in data analysis and modelling.

Learning Outcomes

Upon completion of the course, the student should:

1. Evaluate whether a learning system is appropriate for a particular problem.
2. Understand how to use data for learning, model selection, and testing.
3. Understand, in general terms, the relationship between model complexity and model performance, and be able to use this to design a strategy for improving an existing system.
4. Understand the advantages and disadvantages of the learning systems studied in the course, and decide which is appropriate for a particular application.
5. Build a naive Bayes classifier and interpret its outputs as probabilities.
6. Apply clustering algorithms to simple data sets for cluster analysis.
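Outcome 5 above can be illustrated with a minimal sketch: a categorical naive Bayes classifier with add-one (Laplace) smoothing whose class scores are normalised into posterior probabilities. This is an illustrative example only, not course material; the data set and function names below are hypothetical.

```python
from collections import Counter

def train_nb(X, y):
    """Fit a categorical naive Bayes model: per-class priors plus
    per-class, per-feature value counts (for smoothed likelihoods)."""
    n_features = len(X[0])
    class_counts = Counter(y)
    counts = {c: [Counter() for _ in range(n_features)] for c in class_counts}
    for xi, yi in zip(X, y):
        for j, v in enumerate(xi):
            counts[yi][j][v] += 1
    # distinct values seen for each feature, used in the smoothing denominator
    vocab = [{xi[j] for xi in X} for j in range(n_features)]
    return class_counts, counts, vocab, len(y)

def predict_proba(model, x):
    """Posterior P(class | x) under the naive (conditional independence)
    assumption, with add-one smoothing, normalised to sum to 1."""
    class_counts, counts, vocab, n = model
    scores = {}
    for c, nc in class_counts.items():
        p = nc / n  # class prior P(c)
        for j, v in enumerate(x):
            p *= (counts[c][j][v] + 1) / (nc + len(vocab[j]))  # P(x_j | c)
        scores[c] = p
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

# Hypothetical toy data: predict whether to play given (outlook, temperature).
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
y = ["no", "no", "yes", "yes"]
model = train_nb(X, y)
posterior = predict_proba(model, ("rain", "mild"))  # {'no': 0.25, 'yes': 0.75}
```

Because the scores are normalised over all classes, the returned values can be read directly as probabilities, which is exactly the interpretation the outcome asks for.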

Assessment of Learning Outcomes

By examination: learning outcomes 1, 2, 3, 4, 5, 6.
By laboratory: learning outcomes 2, 3, 5, 6.
Examination: 60%, Laboratory Project: 40%


Syllabus

Introduction to Machine Learning
K-Nearest-Neighbour Classifier
Decision Trees
Model Selection and Empirical Methodologies
Linear Classifiers: Perceptron and SVM
Naïve Bayes Classifier
Basics of Cluster Analysis
K-means Clustering Algorithm
Hierarchical Clustering Algorithm
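The K-means topic above can be sketched in a few lines of Lloyd's algorithm: alternately assign each point to its nearest centroid, then move each centroid to the mean of its cluster. This is an illustrative example only, not course material; the function name and toy data are hypothetical.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm on 2-D points: repeat (assign each point to its
    nearest centroid; move each centroid to the mean of its cluster)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialise with k distinct data points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # index of the nearest centroid (squared Euclidean distance)
            j = min(range(k),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2
                                + (p[1] - centroids[i][1]) ** 2)
            clusters[j].append(p)
        for i, members in enumerate(clusters):
            if members:  # leave a centroid in place if its cluster is empty
                centroids[i] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return centroids, clusters

# Two well-separated blobs should be recovered as two clusters of three points.
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(data, k=2)
```

A fixed iteration count is used here for brevity; a more careful implementation would stop when the assignments no longer change.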

Reading List

Core Text
Title: Introduction to machine learning (3rd edition)
Author: Alpaydin, Ethem
ISBN: 9780262028189
Publisher: MIT Press
Edition: 3rd
Year: 2014

Supplementary Text
Title: Artificial Intelligence: a modern approach (2nd edition)
Author: Russell, S. and P. Norvig
ISBN: 0130803022
Publisher: Prentice Hall
Edition: 2nd
Year: 2003

Supplementary Text
Title: Pattern recognition (4th edition)
Author: Theodoridis, Sergios and Konstantinos Koutroumbas
ISBN: 9781597492720
Publisher: Elsevier
Edition: 4th
Year: 2009