COMP20121: The Implementation and Power of Computer Languages (2008-2009)

This is an archived syllabus from 2008-2009

The Implementation and Power of Computer Languages
Level: 2
Credit rating: 10
Pre-requisites: A programming language, preferably Java (e.g. COMP10092)
Co-requisites: No Co-requisites
Duration: 11 weeks in first semester
Lectures: 22 in total, 2 per week
Examples classes: 3 two-hour examples classes in weeks 4, 9 and 12.
Labs: 12 hours in total, six 2-hour sessions, partly credited to COMP20910/COMP20920
Lecturers: Peter Aczel, Pete Jinks
Course lecturers: Peter Aczel, Richard Banach, Pete Jinks

Timetable
Semester          Event     Location  Day  Time           Group
Sem 1 w1-5,7-12   Lecture   1.3       Mon  11:00 - 12:00  -
Sem 1 w1-5,7-12   Lecture   1.3       Fri  11:00 - 12:00  -
Sem 1 w2-5,7-12   Lab       UNIX      Thu  11:00 - 13:00  F
Sem 1 w2-5,7-12   Lab       UNIX      Mon  14:00 - 16:00  G
Sem 1 w4,9,12     Examples  LF17      Thu  11:00 - 13:00  F
Sem 1 w4,9,12     Examples  LF17      Mon  14:00 - 16:00  G
Assessment Breakdown
Exam: 70%
Coursework: 10%
Lab: 20%

Aims

The first part of this course unit is a practical introduction to the implementation of programming languages; it also gives some insight into the reasons behind certain features of programming languages. The second part examines the underlying theory, giving an insight into the theoretical tools used in the first part. Having discussed languages at various levels of expressivity, it goes on to show how the models underpinning the structure of languages lead to an understanding of the intrinsic limits of computation.

Learning Outcomes

A student completing this course unit should:

Understand the use of formal descriptions such as grammars to capture patterns in syntax, in particular the syntax of programming languages (A).
Understand the syntax-directed methodology for analysis of program text, as used by language processors such as compilers, and have some knowledge of appropriate tools for this, such as lex and yacc (A).
Understand different forms of automata (deterministic and non-deterministic, with and without various memory devices), their correspondence with formal languages at varying levels, and their use in the implementation of grammar-driven tools such as lex and yacc (A).
Have some knowledge of different forms and levels of computation, and appreciate that there are intrinsic limits to what computations can achieve, such as the Halting problem (A).
Be able to manipulate abstract concepts and proofs and cope with abstraction, notations, and transformations between notations, such as that between an automaton that accepts a language and a grammar for the language (B).
Be able to create a formal specification of a grammar from an informal description in English using the appropriate expressive power for the various parts (B,C).
Be able to design, write and debug programs in appropriate meta-languages for language-processing, and in particular to be able to use lex and yacc and their input languages (grammars) (C).
Have experience of the use of automata to recognize patterns in text (C).
Have experience of using Unix tools such as egrep and make (D).

Assessment of Learning outcomes

Learning outcomes 1 and 2 are assessed by examination. Learning outcomes 3, 4 and 5 are assessed by examination and coursework. Learning outcomes 6 and 7 are assessed by examination and lab. Learning outcomes 8 and 9 are assessed by lab.

Contribution to Programme Learning Outcomes

A1, A2, B1, B3, C5, D2, D5, D6

Syllabus

Introduction [2]: Overview. Patterns in texts. The pigeonhole principle.

Recognising patterns in texts [9]: egrep, lex and yacc. Parse trees. Top-down and Bottom-up parsing. How lex and yacc work. Ambiguous grammars. Debugging grammars. Case studies of designing grammars.
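
To give a flavour of this part of the course, the following is a minimal sketch (not taken from the course notes) of a yacc specification for a small calculator. The grammar, the file name calc.y mentioned below, and the hand-written yylex are illustrative choices only; the %left declarations show one way the ambiguity of the expression grammar can be resolved by operator precedence, and in practice the lexer would be generated by lex rather than written by hand.

    %{
    #include <stdio.h>
    #include <ctype.h>
    int yylex(void);
    void yyerror(const char *s);
    %}

    %token NUMBER
    %left '+' '-'          /* precedence declarations resolve the ambiguity */
    %left '*' '/'

    %%
    input : /* empty */
          | input expr '\n'   { printf("= %d\n", $2); }
          ;

    expr  : expr '+' expr     { $$ = $1 + $3; }
          | expr '-' expr     { $$ = $1 - $3; }
          | expr '*' expr     { $$ = $1 * $3; }
          | expr '/' expr     { $$ = $1 / $3; }
          | '(' expr ')'      { $$ = $2; }
          | NUMBER
          ;
    %%

    /* A minimal hand-written lexer; a fuller solution would use lex. */
    int yylex(void)
    {
        int c = getchar();
        while (c == ' ' || c == '\t')
            c = getchar();
        if (isdigit(c)) {
            yylval = c - '0';
            while (isdigit(c = getchar()))
                yylval = yylval * 10 + (c - '0');
            ungetc(c, stdin);
            return NUMBER;
        }
        if (c == EOF)
            return 0;
        return c;               /* single-character tokens: + - * / ( ) \n */
    }

    void yyerror(const char *s) { fprintf(stderr, "%s\n", s); }

    int main(void) { return yyparse(); }

A sketch like this could be built with something along the lines of "yacc calc.y" followed by "cc y.tab.c -o calc", with the file name being purely illustrative.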

Regular languages and finite state automata [3]: Regular expressions, deterministic and non-deterministic finite automata, pumping lemma.
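
By way of illustration (a minimal sketch, not part of the syllabus text), the correspondence between a regular expression and a finite automaton can be seen in a small table-driven DFA. The one below accepts strings over {a, b} containing an even number of a's, i.e. the language of the regular expression (b*ab*a)*b*; the function and test strings are illustrative choices.

    /* Minimal sketch: a table-driven DFA in C. */
    #include <stdio.h>

    int accepts(const char *s)
    {
        /* states: 0 = even number of 'a's seen so far (accepting), 1 = odd */
        static const int delta[2][2] = {
            /*         'a'  'b' */
            /* 0 */  {  1,   0 },
            /* 1 */  {  0,   1 },
        };
        int state = 0;
        for (; *s; s++) {
            if (*s == 'a')      state = delta[state][0];
            else if (*s == 'b') state = delta[state][1];
            else                return 0;   /* symbol outside the alphabet */
        }
        return state == 0;                  /* accept iff we stop in state 0 */
    }

    int main(void)
    {
        const char *tests[] = { "", "ab", "aba", "baab", "abab" };
        for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++)
            printf("%-5s -> %s\n", tests[i],
                   accepts(tests[i]) ? "accepted" : "rejected");
        return 0;
    }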

Context-free languages and pushdown automata [3]: Context-free grammars, pushdown automata, pumping lemma.
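
As a small illustration (again a sketch rather than official course material), the language of balanced parentheses is a classic example that is context-free but not regular: the grammar

    S -> ( S ) S
    S -> ε

generates it, for example via the derivation S => (S)S => ()S => ()(S)S => ()()S => ()(). A pushdown automaton can recognise the language by using its stack to count unmatched opening brackets, while the pumping lemma for regular languages shows that no finite automaton can.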

Turing-recognizable languages and Turing machines [2]: Turing machines, Turing-recognizable, Turing-decidable, the Chomsky hierarchy.

Computability [3]: Turing machines and computations, variants of Turing machines, Church's Thesis, the Halting problem, computability.
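
To give a flavour of the final topic, here is a minimal sketch (not from the course notes) of the classical diagonalisation argument for the Halting problem. The function halts and the program troublemaker are hypothetical names introduced only for this sketch; the stub for halts is deliberately impossible, since the argument shows no correct implementation of it can exist.

    #include <stdio.h>

    /* Hypothetical decider: would return 1 if program p halts on input x,
     * and 0 otherwise.  This stub just returns 1 so the sketch compiles;
     * no correct implementation can exist. */
    int halts(const char *p, const char *x) { (void)p; (void)x; return 1; }

    /* The "diagonal" program: ask the decider about itself, then do the
     * opposite of whatever it predicts. */
    void troublemaker(const char *p)
    {
        if (halts(p, p))
            for (;;) ;          /* predicted to halt -> loop forever */
        /* predicted to loop -> halt immediately */
    }

    int main(void)
    {
        /* Whatever halts("troublemaker", "troublemaker") answers, running
         * troublemaker on its own description would contradict that answer,
         * so no such decider can exist. */
        printf("hypothetical halts(troublemaker, troublemaker) = %d\n",
               halts("troublemaker", "troublemaker"));
        return 0;
    }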

Reading List

The first book attempts to unify the two halves of the course. The next three only cover the material taught in the theoretical (second) half of the course (and much more). Our recommendation is to look at the texts and consult the one that appeals most - they all cover the necessary material. The last book is essentially a reference manual for lex and yacc - you probably won't need to use it, but if you do, it is strongly recommended.

Core Text
Title: Lex and Yacc
Author: John R. Levine, Tony Mason, Doug Brown
ISBN: 1565920007
Publisher: O'Reilly Media, Inc, USA
Edition: 2nd
Year: 1992