Machine Translation and Natural Language Generation
Vision: Making Communication Easier among All Languages & Making
Techniques Available for All Languages
Course Description
The course is designed for beginners in Natural Language Generation,
Natural Language Processing and Machine Translation. The main aim is to
explore the theories and methods for automatically understanding and
generating natural language text, with a special focus on
multilingualism.
Participation requires basic knowledge of machine learning. Please
consider taking online courses, e.g., from Coursera, or watching online
videos beforehand.
The course has been taught every Spring semester at the School of
Computer Science, Nanjing University, since 2020. It was initially
designed for graduate students and later opened to both graduate and
undergraduate students.
Objectives
- Understand the fundamental concepts, challenges and applications in
Natural Language Processing/Generation.
- Learn the evolution of research in Multilingualism, including machine
translation, multilingual models, etc.
- Practice the design, training and application of natural language
generation models.
Outline
1. Introduction
- Problems in Natural Language Processing
- NLP as Classification
- NLP as Structured Prediction
- Natural Language Generation
2. Language Models
- Probabilistic Modeling of Natural Language
- Statistical Language Models
- Neural Language Models and Pretraining
- Large Language Models (LLMs)
- LLMs and Reinforcement Learning
3. Machine Translation
- Traditional Machine Translation (Rule-based Machine Translation,
Statistical Machine Translation)
- Machine Translation and Deep Learning
- Machine Translation and LLMs
- *Machine Translation with Less Parallel Data (Low-resource,
Unsupervised Machine Translation)
- *Non-Autoregressive Machine Translation (Parallel Generation)
- *Interactive Machine Translation (Human-involved Generation)
- *Translation Quality Evaluation (Cross-task Knowledge Transfer)
4. Multilingual LLMs
- Evaluation of Multilinguality
- Extending LLMs to New Languages
- Aligning Language Abilities among Different Languages
- Understanding Multilinguality in LLMs
5. Other Topics in LLMs
- Reasoning Abilities in LLMs
- Alignment and Safety in LLMs
- *LLMs for Biology (Generating Proteins)
- *Multi-modal Large Language Models
6. Other Generation Tasks
- Summarization: Content Selection
- Paraphrase: Semantic Equivalence (Variational Auto-Encoders)
- Style Transfer: Controlled Generation (Generative Adversarial
Networks)
- Image Captioning: Multi-modal Interaction
Note: * marks topics that may change across semesters.
Assignments and Assessment
- Assignment 1: Interacting with Language Models
- Assignment 2: Research on Large Language Models
- Assignment 3: Research on Machine Translation and Multilingualism
- Final Project: Implementation, experiments and discussions on
self-selected topics
Assessment is based on written reports as well as in-class
presentations and discussions.
Instructor
- Shujian Huang (homepage)
- Email: huangsj at nju dot edu dot cn
History
MTNLG2024
Acknowledgement
The course is constantly improved with the help of our wonderful
teaching assistants: Zaixiang Zheng (2020), Yu Bao (2021), Jiahuan
Li (2022), Wenhao Zhu (2023), Changjiang Gao (2024), Peng Ding (2025).