diff --git a/index.md b/index.md
index dc60558..b8eeb10 100644
--- a/index.md
+++ b/index.md
@@ -22,7 +22,7 @@ I (still) [teach NLP](https://github.com/yandexdataschool/nlp_course) at the [Ya
 
 ## News
 
-* 03-06/2021 Invited talks at: [Stanford NLP Seminar](https://nlp.stanford.edu/seminar/), CornellNLP, CambridgeNLP, ...TBU.
+* 03-06/2021 Invited talks at: [Stanford NLP Seminar](https://nlp.stanford.edu/seminar/), CornellNLP, [MT@UPC](https://mt.cs.upc.edu/seminars/), CambridgeNLP, [DeeLIO workshop at NAACL 2021](https://sites.google.com/view/deelio-ws/), ...TBU.
 * 10-12/2020 Invited talks at: CMU, [USC ISI](https://nlg.isi.edu/nl-seminar/), ENS Paris, [ML Street Talk](https://www.youtube.com/watch?v=Q0kN_ZHHDQY).
 * 09/2020 __2__ papers _accepted to __EMNLP__ 2020_.
 * 06-08/2020 Invited talks at: MIT, DeepMind, [Grammarly AI](https://grammarly.ai/information-theoretic-probing-with-minimum-description-length/), Unbabel, [NLP with Friends](https://nlpwithfriends.com).
diff --git a/nlp_course.html b/nlp_course.html
index 5505fa6..0594f27 100644
--- a/nlp_course.html
+++ b/nlp_course.html
@@ -193,7 +193,7 @@

 This new format of the course is designed for:
 
 Seminars & Homeworks
-notebooks in our 6.8k-☆ course repo
+notebooks in our 6.9k-☆ course repo
@@ -256,7 +256,7 @@

 Bonus:
 
 Seminars & Homeworks
 
 For each topic, you can take notebooks from
-our 6.8k-☆ course repo.
+our 6.9k-☆ course repo.
 
 From 2020, both PyTorch and Tensorflow!

diff --git a/talks.md b/talks.md
index 68ae410..33ca0a7 100644
--- a/talks.md
+++ b/talks.md
@@ -13,7 +13,7 @@ Program committee:
 * NAACL 2021
 * ICML 2020 (top-33% reviewer)
 * NeurIPS 2020
-* ICLR 2021
+* ICLR 2021 (outstanding reviewer)
 * Workshops: [RepL4NLP](https://sites.google.com/view/repl4nlp2020/home) 2020 (at ACL), [DeeLIO](https://sites.google.com/view/deelio-ws/program-committee) 2020 (at EMNLP)
@@ -34,6 +34,13 @@
       [Yandex Research Summit](https://yandex.com/promo/academy/yars_2019), Moscow, Russia (with David Talbot)
 
 ### Invited talks
+
+* 03-06/2021 _NMT Analysis: The Trade-Off Between Source and Target, and (a bit of) the Training Process_ Slides
+      [Stanford NLP Seminar](https://nlp.stanford.edu/seminar/)
+      Cornell NLP
+      [MT@UPC](https://mt.cs.upc.edu/seminars/)
+
+
 * 08-12/2020 _Evaluating Source and Target Contributions to NMT Predictions_ Slides
       Unbabel
       ENS Paris