Computational Semantics for Natural Language Processing

ETH Zürich, Spring Semester 2025: Course catalog


Course Description

This course presents an introduction to natural language processing (NLP) with an emphasis on computational semantics, i.e., the process of constructing and reasoning with meaning representations of natural language text.

The objective of the course is to learn about various topics in computational semantics and their importance in natural language processing methodology and research. Exercises and the project will be key parts of the course, giving students hands-on experience with state-of-the-art techniques in the field.


Grading

The final assessment will be a combination of a group paper presentation (10%), two graded exercises (40%), and the project (50%). There will be no written exam.

Lectures: Fri 14:00-16:00 (CAB G61)

Discussion Sections: Fri 16:00-17:00

Office Hours (assignments, project): Please contact the professor or TAs for an appointment.

Textbooks: We will not follow any particular textbook. We will draw material from a number of research papers and courses taught around the world. However, the following textbooks may be useful as references:

  1. Introduction to Natural Language Processing by Jacob Eisenstein
  2. Speech and Language Processing by Jurafsky and Martin

News

21.02   Class website is online!


Course Schedule

Lecture 1 (21.02): Introduction
  Materials: Diagnostic Quiz; Answers to quiz; Guidelines for Paper Presentation; Presentation preference indication

Lecture 2 (28.02): The Distributional Hypothesis and Word Vectors
  Readings: 1. GloVe

Optional (28.02): Project (Group Formation)

Lecture 3 (07.03): Word Vectors 2, Word Senses and Sentence Vectors (Recursive and Recurrent Neural Networks)
  Readings:
  1. Unsupervised Word Sense Disambiguation Rivaling Supervised Methods
  2. Improving Vector Space Word Representations Using Multilingual Correlation
  3. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation

Voluntary (07.03): Projects (Introduction and Guidelines) (TA: Yifan)
  Materials: 1. Guidelines; 2. Suggested projects (TBU)

Lecture 4 (14.03): NLU beyond a Sentence; Seq2Seq and Attention; Case Study: Sentence Similarity, Textual Entailment and Machine Comprehension
  Readings:
  1. Massive Exploration of Neural Machine Translation Architectures
  2. Bidirectional Attention Flow for Machine Comprehension

Voluntary (14.03): Project (Rotation). Bring your project title and find your supervising TA. (TA: All TAs)

Lecture 5 (21.03): Syntax and Predicate Argument Structures (Semantic Role Labelling, Frame Semantics, etc.)
  Readings:
  1. Stanford's Graph-based Neural Dependency Parser at the CoNLL 2017 Shared Task
  2. Grammar as a Foreign Language
  Events: Assignment 1 released

Voluntary (21.03): Project (Proposal Discussion 1). Project feasibility, topic, and proposal summary; bring ideas (and slides if needed) for discussion. (TA: All TAs)

Lecture 6 (28.03): Predicate Argument Structures II (Semantic Role Labelling, Frame Semantics, etc.)
  Readings:
  1. Jointly Predicting Predicates and Arguments in Neural Semantic Role Labeling
  2. Frame-Semantic Parsing

Voluntary (28.03): Cluster usage guidelines and Q&A session (TA: Shehzaad)

Lecture 7 (04.04): Modelling and Tracking Entities: NER, Coreference and Information Extraction (Entity and Relation Extraction)
  Readings:
  1. End-to-end Neural Coreference Resolution
  2. Improving Coreference Resolution by Learning Entity-Level Distributed Representations

Voluntary (04.04): Project (Proposal Discussion 2). Project feasibility, topic, and proposal summary; bring ideas (and slides if needed) for discussion. (TA: All TAs)

Lecture 8 (11.04): Formal Representations of Language Meaning
  Readings:
  1. Compositional Semantic Parsing on Semi-Structured Tables
  2. Supertagging with LSTMs
  Events: Project proposal due

Optional (11.04): Q&A for Assignment 1; Project (if necessary) (TA: All TAs)

Easter break (18.04 and 25.04): no class

Lecture 9 (02.05): Transformers and Contextual Word Representations (BERT, etc.)
  Readings:
  1. Big Bird: Transformers for Longer Sequences (only the idea of sparse attention; the Turing completeness and theoretical results are not required)
  2. BERT Rediscovers the Classical NLP Pipeline
  Events: Assignment 1 due; Assignment 2 released; Project proposal grades out

Voluntary (02.05): Huggingface and Transformers (TA: Sankalan)
  Materials: 1. Huggingface

Lecture 10 (09.05): Natural Language Generation; Case Study: Summarization and Conversation Modelling
  Readings:
  1. Language Models are Unsupervised Multitask Learners
  2. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

Optional (09.05): Project (1-1 Discussion). Project progress, problems, and overall storyline. (TA: All TAs)

Lecture 11 (16.05): Question Answering
  Readings:
  1. Reading Wikipedia to Answer Open-Domain Questions
  2. Latent Retrieval for Weakly Supervised Open Domain Question Answering
  Events: Assignment 1 grades out

Optional (16.05): Project (1-1 Discussion). Project progress, problems, and overall storyline. (TA: All TAs)

Lecture 12 (23.05): Reasoning
  Readings: TBD
  Events: Project mid-term report due

Optional (23.05): Project (1-1 Discussion). Schedule a meeting with your project TA if necessary. (TA: All TAs)

Lecture 13 (30.05): Language + {Knowledge, Vision, Action}
  Readings:
  1. Knowledge Enhanced Contextual Word Representations
  2. VisualBERT: A Simple and Performant Baseline for Vision and Language
  Events: Assignment 2 due

Optional (30.05): Project (1-1 Discussion). Schedule a meeting with your project TA if necessary. (TA: All TAs)

20.06: Assignment 2 grades out; project report due

11.07: Schedule and link for the GatherTown poster session

Assignment Submission Instructions

Moodle

Materials


Contact

You can ask questions on Moodle. Please post questions there so that others can see them and join the discussion. If you have questions that are not of general interest, please don't hesitate to contact us directly.

Lecturer: Mrinmaya Sachan
Guest Lecturers: Mubashara Akhtar, Yinya Huang
Teaching Assistants: Shehzaad Dhuliawala, Yifan Hou, Sankalan Pal Chowdhury, Piyushi Goyal