CC6205 - Natural Language Processing

This repository contains the slides and supporting material for a course on Natural Language Processing.

Info

The neural-network topics of the course are taken from Yoav Goldberg's book, Neural Network Methods for Natural Language Processing. The non-neural-network topics (e.g., grammars, HMMs) are taken from Michael Collins' NLP course.

Slides

  1. Introduction to Natural Language Processing | (tex source file)
  2. Vector Space Model and Information Retrieval | (tex source file), video 1, video 2
  3. Language Models (slides by Michael Collins), notes, video 1, video 2, video 3, video 4
  4. Text Classification and Naive Bayes (slides by Dan Jurafsky), notes, video 1, video 2, video 3 (a toy Python sketch of Naive Bayes follows this list)
  5. Linear Models | (tex source file), video 1, video 2, video 3, video 4
  6. Neural Networks | (tex source file)
  7. Word Vectors | (tex source file)
  8. Tagging, and Hidden Markov Models (slides by Michael Collins), notes, videos
  9. MEMMs and CRFs | (tex source file)
  10. Convolutional Neural Networks | (tex source file), video
  11. Recurrent Neural Networks | (tex source file), video 1, video 2, video 3
  12. Sequence to Sequence Models, Attention, and the Transformer | (tex source file), video 1, video 2, video 3
  13. Constituency Parsing slides 1, slides 2, slides 3, slides 4 (slides by Michael Collins), notes 1, notes 2, videos 1, videos 2, videos 3, videos 4
  14. Recursive Networks and Paragraph Vectors | (tex source file)
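
As a taste of topic 4, here is a minimal sketch of a multinomial Naive Bayes text classifier with add-one (Laplace) smoothing, written in plain Python. It is an illustration only, not code from the course materials; the tiny training set and the names `train_nb` / `predict_nb` are invented for the example.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label) pairs. Returns a Laplace-smoothed model."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)  # word_counts[label][word] = count
    vocab = set()
    for tokens, label in docs:
        class_counts[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    total = sum(class_counts.values())
    log_prior = {c: math.log(n / total) for c, n in class_counts.items()}
    return log_prior, word_counts, vocab

def predict_nb(model, tokens):
    """Pick the label maximizing log P(c) + sum over tokens of log P(w|c)."""
    log_prior, word_counts, vocab = model
    best_label, best_score = None, float("-inf")
    for c in log_prior:
        denom = sum(word_counts[c].values()) + len(vocab)  # add-one smoothing
        score = log_prior[c] + sum(
            math.log((word_counts[c][w] + 1) / denom)
            for w in tokens if w in vocab  # ignore out-of-vocabulary words
        )
        if score > best_score:
            best_label, best_score = c, score
    return best_label

# Tiny invented training set: movie-review snippets labeled pos/neg.
train = [
    ("good great fun".split(), "pos"),
    ("great acting fun plot".split(), "pos"),
    ("boring bad awful".split(), "neg"),
    ("bad plot awful acting".split(), "neg"),
]
model = train_nb(train)
print(predict_nb(model, "fun great plot".split()))  # -> pos
```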

Other Resources

  1. Speech and Language Processing (3rd ed. draft) by Dan Jurafsky and James H. Martin.
  2. Michael Collins' NLP notes.
  3. A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg.
  4. Natural Language Understanding with Distributed Representation by Kyunghyun Cho
  5. Natural Language Processing Book by Jacob Eisenstein
  6. CS224n: Natural Language Processing with Deep Learning, Stanford course
  7. NLP-progress: Repository to track the progress in Natural Language Processing (NLP)
  8. NLTK book
  9. AllenNLP: Open source project for designing deep learning-based NLP models
  10. Real World NLP Book: AllenNLP tutorials
  11. Attention is all you need explained
  12. The Illustrated Transformer: a very illustrative blog post about the Transformer
  13. ELMo explained
  14. BERT explained
  15. Better Language Models and Their Implications OpenAI Blog
  16. David Bamman NLP Slides @Berkeley
  17. RNN effectiveness
  18. SuperGLUE: a benchmark of Natural Language Understanding Tasks
  19. decaNLP (The Natural Language Decathlon): a benchmark for studying general NLP models that can perform a variety of complex natural language tasks.
  20. Deep Learning in NLP: slides by Horacio Rodríguez
  21. Chatbot and Related Research Paper Notes with Images
  22. XLNet Explained
  23. PyTorch-Transformers: a library of state-of-the-art pre-trained models for Natural Language Processing (NLP)
  24. Ben Trevett's torchtext tutorials
  25. PLMpapers: a collection of papers about Pre-Trained Language Models
  26. The Illustrated GPT-2 (Visualizing Transformer Language Models)
  27. Linguistics, NLP, and Interdisciplinarity Or: Look at Your Data, by Emily M. Bender
  28. The State of NLP Literature: Part I, by Saif Mohammad
  29. From Word to Sense Embeddings: A Survey on Vector Representations of Meaning
  30. 10 ML & NLP Research Highlights of 2019 by Sebastian Ruder
  31. Towards a Conversational Agent that Can Chat About…Anything
  32. Dive into Deep Learning Book
  33. Stanza - A Python NLP Library for Many Human Languages (a minimal usage sketch follows this list)
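
Several of the items above are software libraries; as one concrete example, here is a minimal usage sketch of Stanza (item 33). The calls follow Stanza's documented pipeline API (`stanza.download`, `stanza.Pipeline`), but the example sentence is invented and the processors chosen here are just one reasonable configuration.

```python
import stanza

# One-time download of the English models (assumes network access).
stanza.download("en")

# Build a pipeline with tokenization, POS tagging, and lemmatization.
nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma")

doc = nlp("Natural language processing turns text into structure.")
for sentence in doc.sentences:
    for word in sentence.words:
        print(word.text, word.upos, word.lemma)
```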

Videos

  1. Natural Language Processing MOOC videos by Dan Jurafsky and Chris Manning, 2012
  2. Natural Language Processing MOOC videos by Michael Collins, 2013
  3. Natural Language Processing with Deep Learning by Chris Manning and Richard Socher, 2017
  4. CS224N: Natural Language Processing with Deep Learning | Winter 2019
  5. Computational Linguistics I by Jordan Boyd-Graber, University of Maryland
  6. Visualizing and Understanding Recurrent Networks
  7. BERT Research Series by Chris McCormick
  8. Successes and Challenges in Neural Models for Speech and Language - Michael Collins
