
COMP34812 Natural Language Understanding (syllabus 2021-2022)

Level 3
Credits: 10
Enrolled students: 98

Course leader: Riza Batista-Navarro



Requisites

  • Pre-Requisite (Compulsory): COMP34711

Assessment methods

  • 50% Written exam
  • 50% Practical skills assessment
Timetable
Semester               Event         Location         Day  Time           Group
Sem 2 w20-27,31-34     Asynchronous  Williamson G.47  Thu  09:00 - 10:00  -
Sem 2 w20-27,31,33-34  Lecture       IT407            Mon  09:00 - 10:00  -
Sem 2 w23,25,27,32,34  Workshop      IT407            Fri  10:00 - 11:00  -

Overview

Drawing on concepts covered in the prerequisite unit, COMP34711 Natural Language Processing, this unit enables students to look more deeply into how machines analyse and recognise meaning expressed in natural language. Students will gain hands-on experience in investigating solutions to a number of natural language understanding tasks. This will equip them with the know-how required to develop technologies for real-world applications that enable communication between humans and machines, applications which have become increasingly ubiquitous and indispensable.

Aims

The unit aims to:

  • introduce students to the concepts and computational methods that enable machines to understand and interpret natural language
  • explain the various tasks that underpin natural language understanding, and provide an overview of state-of-the-art solutions to these tasks as well as their real-world applications

Syllabus

  • Introduction to NLU; Task formulations and applications
  • Meaning representations: symbolic parsing and logical representations of sentences
  • Vector-based representations (contextualised embeddings)
  • Neural networks and neural language models
  • Evaluation of models
  • Sequence classification and textual entailment (and applications)
  • Sequence labelling (and applications)
  • Machine reading comprehension (and applications)
  • Sequence-to-sequence translation (and applications)
  • Limits and weaknesses of state-of-the-art approaches to NLU

Teaching methods

Asynchronous lectures (weekly)

Synchronous workshops (weekly)

Labs (fortnightly)

Feedback methods

Discussions and live coding during workshops (weekly)

Labs to support coursework (fortnightly)

Cohort-level feedback on exam

Study hours

  • Lectures (20 hours)
  • Practical classes & workshops (10 hours)

Employability skills

  • Analytical skills
  • Problem solving
  • Written communication

Learning outcomes

On successful completion of this unit, a student will be able to:

  • Discuss the formulation of different natural language understanding tasks as sequence processing tasks, e.g. sequence classification, sequence-to-sequence translation and sequence labelling.
  • Differentiate between different types of parsing algorithms and apply them to natural language data to produce meaning representations.
  • Compare different approaches to tasks such as named entity recognition and sentiment analysis.
  • Relate natural language understanding tasks to applications such as question answering and conversational agents, among others.
  • Develop a solution to a natural language understanding task with application to a real-world problem.

Reading list

  • Speech and Language Processing. Daniel Jurafsky & James H. Martin. (Multiple chapters, including Chapter 15: Logical Representations of Sentence Meaning.)
  • The Handbook of Computational Linguistics and Natural Language Processing. Wiley-Blackwell, 2013. ISBN 9781118448670.
  • How to Fine-Tune BERT for Text Classification? Chi Sun, Xipeng Qiu, Yige Xu, Xuanjing Huang.
  • Hierarchical Transformers for Long Document Classification. Raghavendra Pappagari, Piotr Żelasko, Jesús Villalba, Yishay Carmiel, and Najim Dehak.
  • Zero-Shot Relation Extraction via Reading Comprehension. Omer Levy, Minjoon Seo, Eunsol Choi, Luke Zettlemoyer.
  • End-to-end Neural Coreference Resolution. Kenton Lee, Luheng He, Mike Lewis, and Luke Zettlemoyer.
  • BERT for Coreference Resolution: Baselines and Analysis. Mandar Joshi, Omer Levy, Daniel S. Weld, Luke Zettlemoyer.
  • A Unified MRC Framework for Named Entity Recognition. Xiaoya Li, Jingrong Feng, Yuxian Meng, Qinghong Han, Fei Wu and Jiwei Li.
  • Daniel W. Otter, Julian R. Medina and Jugal K. Kalita. [title not listed]
  • Ivano Lauriola, Alberto Lavelli and Fabio Aiolli. [title not listed]
  • Amirsina Torfi, Rouzbeh A. Shirvani, Yaser Keneshloo, Nader Tavaf and Edward A. Fox. [title not listed]
  • Wafaa S. El-Kassas, Cherif R. Salama, Ahmed A. Rafea, Hoda K. Mohamed. [title not listed]