COMP20121: The Implementation and Power of Computer Languages (2008-2009)
The first part of this course unit is a practical introduction to the implementation of programming languages; it also gives some insight into the reasons behind certain features of programming languages. The second part examines the underlying theory, explaining the theoretical tools used in the first part. Having discussed languages at various levels of expressivity, it goes on to show how the models underpinning the structure of languages lead to an understanding of the intrinsic limits of computation.
A student completing this course unit should:
Understand the use of formal descriptions such as grammars to capture patterns in syntax, in particular the syntax of programming languages (A).
Understand the syntax-directed methodology for analysis of program text, as used by language processors such as compilers, and have some knowledge about the use of appropriate tools for this such as lex and yacc (A).
Understand different forms of automata (deterministic and non-deterministic, with and without various memory devices), their correspondence with formal languages at various levels, and their use in the implementation of grammar-driven tools such as lex and yacc (A).
Have some knowledge about different forms and levels of computation and appreciate that there are intrinsic limits to what computations can achieve such as the Halting problem (A).
Be able to manipulate abstract concepts and proofs and cope with abstraction, notations, and transformation between notations, such as the transformation between an automaton that accepts a language and a grammar for that language (B).
Be able to create a formal specification of a grammar from an informal description in English using the appropriate expressive power for the various parts (B,C).
Be able to design, write and debug programs in appropriate meta-languages for language-processing, and in particular to be able to use lex and yacc and their input languages (grammars) (C).
Have experience of the use of automata to recognize patterns in text (C).
Have experience of using Unix tools such as egrep and make (D).
Assessment of Learning Outcomes
Learning outcomes 1 and 2 are assessed by examination. Learning outcomes 3, 4 and 5 are assessed by examination and coursework. Learning outcomes 6 and 7 are assessed by examination and lab. Learning outcomes 8 and 9 are assessed by lab.
Contribution to Programme Learning Outcomes
A1, A2, B1, B3, C5, D2, D5, D6
Introduction : Overview. Patterns in texts. The pigeonhole principle.
Recognising patterns in texts : egrep, lex and yacc. Parse trees. Top-down and Bottom-up parsing. How lex and yacc work. Ambiguous grammars. Debugging grammars. Case studies of designing grammars.
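The top-down parsing and parse-tree topics above can be illustrated with a small sketch. The following recursive-descent parser in Python is not part of the course material; the tiny arithmetic grammar, the function names, and the tuple-based parse-tree representation are all our own choices. It shows how layering the grammar into expr/term/factor levels designs away the ambiguity over operator precedence that a flat grammar would have.

```python
# A minimal recursive-descent (top-down) parser, an illustrative
# sketch only.  The grammar is written so that '*' and '/' bind
# tighter than '+' and '-', which is how the precedence ambiguity
# of a flat grammar is avoided:
#
#   expr   -> term   (('+'|'-') term)*
#   term   -> factor (('*'|'/') factor)*
#   factor -> NUMBER | '(' expr ')'
import re

def tokenize(text):
    return re.findall(r"\d+|[-+*/()]", text)

def parse(text):
    tokens = tokenize(text)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(expected=None):
        nonlocal pos
        tok = peek()
        if tok is None or (expected is not None and tok != expected):
            raise SyntaxError(f"expected {expected!r}, got {tok!r}")
        pos += 1
        return tok

    def factor():
        if peek() == "(":
            eat("(")
            node = expr()
            eat(")")
            return node
        tok = eat()
        if not tok.isdigit():
            raise SyntaxError(f"unexpected token {tok!r}")
        return int(tok)

    def term():
        node = factor()
        while peek() in ("*", "/"):
            node = (eat(), node, factor())   # parse tree as nested tuples
        return node

    def expr():
        node = term()
        while peek() in ("+", "-"):
            node = (eat(), node, term())
        return node

    tree = expr()
    if peek() is not None:
        raise SyntaxError(f"trailing input at {peek()!r}")
    return tree
```

For example, parse("2+3*4") yields the tree ('+', 2, ('\*', 3, 4)): the multiplication is grouped below the addition purely because of how the grammar is layered. Yacc resolves the same question with precedence declarations instead of grammar layering.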
Regular languages and finite state automata : Regular expressions, deterministic and non-deterministic finite automata, pumping lemma.
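To give a flavour of the automata topic above, here is a sketch (our own example, not course material) of simulating a non-deterministic finite automaton by tracking the set of states it could currently be in; this is the on-the-fly version of the subset construction that turns an NFA into a DFA, and is essentially what tools like egrep do.

```python
# An NFA over {a, b} accepting exactly the strings that end in "ab".
# State 0 loops on everything; on 'a' the machine may also "guess"
# that this 'a' begins the final "ab" and move to state 1.
NFA = {
    # (state, symbol) -> set of possible next states
    (0, "a"): {0, 1},
    (0, "b"): {0},
    (1, "b"): {2},
}
START, ACCEPTING = 0, {2}

def accepts(word):
    states = {START}                     # all states the NFA might be in
    for symbol in word:
        states = set().union(*(NFA.get((s, symbol), set()) for s in states))
    return bool(states & ACCEPTING)      # accept if any run reached state 2
```

The key point is that non-determinism costs nothing extra to simulate for finite automata: the set of reachable states is itself finite, which is why every NFA has an equivalent DFA.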
Context-free languages and pushdown automata : Context-free grammars, pushdown automata, pumping lemma.
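The role of the stack in a pushdown automaton can be sketched as follows (our own example, not from the course notes): a deterministic PDA for { a^n b^n : n >= 0 }, a context-free language that no finite automaton can recognise, because recognising it requires unbounded counting memory.

```python
def accepts_anbn(word):
    """Simulate a PDA for { a^n b^n : n >= 0 } with an explicit stack."""
    stack = []
    seen_b = False
    for ch in word:
        if ch == "a":
            if seen_b:            # an 'a' after a 'b' is never legal
                return False
            stack.append("A")     # push one marker per 'a'
        elif ch == "b":
            seen_b = True
            if not stack:         # more b's than a's so far
                return False
            stack.pop()           # match one 'a' against this 'b'
        else:
            return False          # symbol outside the alphabet
    return not stack              # accept iff every 'a' was matched
```

The pumping lemma for regular languages shows why no DFA can do this: with only finitely many states, long enough blocks of a's must repeat a state, so the count of a's cannot be remembered exactly.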
Turing-recognizable languages and Turing machines : Turing machines, Turing-recognizable, Turing-decidable, the Chomsky hierarchy.
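The mechanics of a Turing machine can also be shown in a few lines. Below is a sketch (our own code, with a transition table adapted from the standard textbook machine for { 0^n 1^n }): a generic simulator plus one machine. The state names, the `run_tm` interface, and the step budget are all our own choices; note that the step budget is a pragmatic cut-off, since in general one cannot decide whether a Turing machine will halt.

```python
def run_tm(transitions, tape_input, start="q1",
           accept="qacc", reject="qrej", max_steps=10_000):
    """Run a deterministic TM; '_' is the blank symbol."""
    tape = dict(enumerate(tape_input))        # sparse, two-way-infinite tape
    head, state = 0, start
    for _ in range(max_steps):
        if state == accept:
            return True
        if state == reject:
            return False
        symbol = tape.get(head, "_")
        if (state, symbol) not in transitions:
            return False                      # missing rule: implicit reject
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("step budget exhausted (halting is undecidable in general)")

# Machine for { 0^n 1^n }: repeatedly cross off the leftmost 0 (as X)
# and the leftmost 1 (as Y), shuttling back and forth.
ZERO_N_ONE_N = {
    ("q1", "0"): ("q2", "X", "R"),   # cross off a 0, go find a 1
    ("q1", "Y"): ("q4", "Y", "R"),   # no 0s left: verify only Ys remain
    ("q1", "_"): ("qacc", "_", "R"), # empty input is in the language
    ("q2", "0"): ("q2", "0", "R"),
    ("q2", "Y"): ("q2", "Y", "R"),
    ("q2", "1"): ("q3", "Y", "L"),   # cross off the matching 1
    ("q3", "0"): ("q3", "0", "L"),
    ("q3", "Y"): ("q3", "Y", "L"),
    ("q3", "X"): ("q1", "X", "R"),   # back at the frontier, repeat
    ("q4", "Y"): ("q4", "Y", "R"),
    ("q4", "_"): ("qacc", "_", "R"), # nothing but Ys: accept
}
```

The same simulator runs any deterministic transition table, which is one way to see why Turing machines are a convenient base model: the machine description is just data, a fact the Halting-problem argument in the next topic exploits.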
Computability : Turing machines and computations, variants of Turing machines, Church's Thesis, the Halting problem, computability.
The first book attempts to unify the two halves of the course. The next three cover only the material taught in the theoretical (second) half of the course (and much more). Our recommendation is to look at the texts and consult whichever one appeals most - they all cover the necessary material. The last book is essentially a reference manual for lex and yacc - you probably won't need it, but if you do, it comes strongly recommended.
Core Text
Title: Lex and Yacc
Author: John R. Levine, Tony Mason, Doug Brown
Publisher: O'Reilly Media, Inc, USA