This is a course on natural language processing.

- Lecturer: Felipe Bravo-Marquez
- Lectures: Tuesday 14:30 - 16:00, Thursday 14:30 - 16:00 (Lecture Room B04, Beauchef 851, Edificio Poniente)
- Course Program (in Spanish)
The neural-network-related topics of the course are taken from Yoav Goldberg's book Neural Network Methods for Natural Language Processing. The non-neural-network topics (e.g., grammars, HMMs) are taken from Michael Collins' course.
- Introduction to Natural Language Processing | (tex source file)
- Vector Space Model and Information Retrieval | (tex source file), video 1, video 2
- Language Models (slides by Michael Collins), notes, video 1, video 2, video 3, video 4 (a minimal bigram-model sketch follows this list)
- Text Classification and Naive Bayes (slides by Dan Jurafsky), notes, video 1, video 2, video 3
- Linear Models | (tex source file), video 1, video 2, video 3, video 4
- Neural Networks | (tex source file)
- Word Vectors | (tex source file)
- Tagging and Hidden Markov Models (slides by Michael Collins), notes, videos
- MEMMs and CRFs | (tex source file)
- Convolutional Neural Networks | (tex source file), video
- Recurrent Neural Networks | (tex source file), video 1, video 2, video 3
- Sequence to Sequence Models, Attention, and the Transformer | (tex source file), video 1, video 2, video 3
- Constituency Parsing | slides 1, slides 2, slides 3, slides 4 (slides by Michael Collins), notes 1, notes 2, videos 1, videos 2, videos 3, videos 4
- Recursive Networks and Paragraph Vectors | (tex source file)
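As a hands-on complement to the Language Models material above, here is a minimal sketch of a bigram language model with add-one (Laplace) smoothing, in the spirit of the Collins notes. The toy corpus and function names are illustrative, not course material:

```python
from collections import Counter

# Toy corpus (illustrative only).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

# Count unigrams and bigrams over sentences padded with boundary symbols.
unigrams = Counter()
bigrams = Counter()
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    unigrams.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

vocab_size = len(unigrams)

def bigram_prob(prev, word):
    """P(word | prev) with add-one smoothing."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

# A seen continuation gets a higher probability than an unseen one.
print(bigram_prob("cat", "sat"))  # 0.2
print(bigram_prob("cat", "ran"))  # 0.1
```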
- Speech and Language Processing (3rd ed. draft) by Dan Jurafsky and James H. Martin.
- Michael Collins' NLP notes.
- A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg.
- Natural Language Understanding with Distributed Representation by Kyunghyun Cho
- Natural Language Processing Book by Jacob Eisenstein
- CS224n: Natural Language Processing with Deep Learning, Stanford course
- NLP-progress: Repository to track the progress in Natural Language Processing (NLP)
- NLTK book
- AllenNLP: Open source project for designing deep learning-based NLP models
- Real World NLP Book: AllenNLP tutorials
- Attention Is All You Need explained
- The Illustrated Transformer: a very illustrative blog post about the Transformer
- ELMo explained
- BERT explained
- Better Language Models and Their Implications, OpenAI Blog
- David Bamman NLP Slides @Berkeley
- RNN effectiveness
- SuperGLUE: a benchmark of Natural Language Understanding Tasks
- decaNLP, The Natural Language Decathlon: a benchmark for studying general NLP models that can perform a variety of complex natural language tasks.
- Deep Learning in NLP: slides by Horacio Rodríguez
- Chatbot and Related Research Paper Notes with Images
- XLNet Explained
- PyTorch-Transformers: a library of state-of-the-art pre-trained models for Natural Language Processing (NLP); a short usage sketch appears at the end of this README
- Ben Trevett's torchtext tutorials
- PLMpapers: a collection of papers about Pre-Trained Language Models
- The Illustrated GPT-2 (Visualizing Transformer Language Models)
- Linguistics, NLP, and Interdisciplinarity Or: Look at Your Data, by Emily M. Bender
- The State of NLP Literature: Part I, by Saif Mohammad
- From Word to Sense Embeddings: A Survey on Vector Representations of Meaning
- 10 ML & NLP Research Highlights of 2019 by Sebastian Ruder
- Towards a Conversational Agent that Can Chat About…Anything
- Dive into Deep Learning Book
- Stanza - A Python NLP Library for Many Human Languages
- Natural Language Processing MOOC videos by Dan Jurafsky and Chris Manning, 2012
- Natural Language Processing MOOC videos by Michael Collins, 2013
- Natural Language Processing with Deep Learning by Chris Manning and Richard Socher, 2017
- CS224N: Natural Language Processing with Deep Learning | Winter 2019
- Computational Linguistics I by Jordan Boyd-Graber, University of Maryland
- Visualizing and Understanding Recurrent Networks
- BERT Research Series by Chris McCormick
- Successes and Challenges in Neural Models for Speech and Language - Michael Collins
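For readers who want to try the PyTorch-Transformers library linked above, here is a minimal usage sketch, assuming the pytorch-transformers 1.x API (the example sentence is illustrative; later versions of the library were renamed to transformers and changed some defaults):

```python
import torch
from pytorch_transformers import BertModel, BertTokenizer

# Load a pre-trained BERT model and its WordPiece tokenizer.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

# Special tokens are added explicitly, since encode() in the 1.x API
# does not insert them by default.
text = "[CLS] natural language processing [SEP]"
token_ids = torch.tensor([tokenizer.encode(text)])

# Extract contextual token embeddings from the last layer.
with torch.no_grad():
    last_hidden_state = model(token_ids)[0]  # shape: (1, num_tokens, 768)
print(last_hidden_state.shape)
```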