Large Language Models and Generative AI for NLP

THIS REPOSITORY WILL EVOLVE OVER THE DURATION OF THE COURSE. WE WILL ADD CONTENT AS WE GO.

Time: Fall 2024 / Period 2
Target group: Master's students
Teachers: Aarne, Jussi, and Dmitry (see the syllabus below)

Prerequisites:

  • Python coding experience
  • Basics of machine learning (e.g., Machine Learning for Linguists (LDA-T317, KIK-LG210))

Course Description

This hands-on course delves into the world of Large Language Models (LLMs) and their applications in Natural Language Processing (NLP). Students will gain an understanding of how LLMs work, how to fine-tune them for specific tasks, and how to leverage their capabilities for various NLP applications. Through weekly lectures and coding labs, students will get practical experience working with state-of-the-art LLMs and explore their potential to revolutionize the field of NLP.

Evaluation

The course will be evaluated based on the submission of a final report.

Students will need to submit a final report that covers all the labs:

What was done in each lab? What was the motivation behind your solutions? What did you learn? What challenges did you encounter?

Syllabus

Week | Dates | Topic | Lecture Format | Teacher
1 | 29/31.10. | Introduction to Generative AI and Large Language Models (LLM) | 90 min lecture and 90 min lab | Aarne
2 | 05/07.11. | Using LLMs and Prompting-based approaches | 90 min lecture and 90 min coding lab | Aarne
3 | 12/14.11. | Evaluating LLMs | 90 min lecture and 90 min coding lab | Jussi
4 | 19/21.11. | Fine-tuning LLMs | 90 min lecture and 90 min coding lab | Aarne
5 | 26/28.11. | Retrieval Augmented Generation (RAG) | 90 min lecture and 90 min coding lab | Dmitry
6 | 03/05.12. | Use cases and applications of LLMs | 90 min lecture and 90 min coding lab | Dmitry
7 | 10/12.12. | Final report preparation | Students work on their final report | Aarne

Detailed Syllabus:

Week 1: Introduction to Generative AI and Large Language Models (LLM)

  • Overview of Generative AI and its applications in NLP
  • Introduction to Large Language Models (LLMs) and their architecture
  • Lab: Learn about tokenizers
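
To give a flavor of the tokenizer lab, the sketch below loads a pretrained tokenizer and inspects how a sentence is split into subword tokens. It assumes the Hugging Face transformers library and uses gpt2 purely as an illustrative model choice; the lab may use a different tokenizer.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library is installed;
# "gpt2" is an illustrative model choice, not necessarily the one used in the lab.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

text = "Large Language Models and Generative AI for NLP"
encoding = tokenizer(text)

# Token IDs and the subword strings they correspond to
print(encoding["input_ids"])
print(tokenizer.convert_ids_to_tokens(encoding["input_ids"]))

# Round-trip back to text
print(tokenizer.decode(encoding["input_ids"]))
```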

Week 2: Using LLMs and Prompting-based approaches

  • Understanding prompt engineering and its importance in working with LLMs
  • Exploring different prompting techniques for various NLP tasks
  • Hands-on lab: Experimenting with different prompts and evaluating their effectiveness
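
A minimal illustration of the kind of prompting experiment this lab involves: comparing a zero-shot and a few-shot prompt for the same task. The transformers pipeline and the gpt2 model are illustrative assumptions only; the lab may use a different, instruction-tuned model or an API.

```python
# A minimal prompting sketch; `gpt2` is an illustrative (and deliberately small) model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The same sentiment task phrased as a zero-shot and as a few-shot prompt
zero_shot = (
    "Classify the sentiment of this review as positive or negative.\n"
    "Review: 'The plot was dull.'\nSentiment:"
)
few_shot = (
    "Review: 'Loved every minute.' Sentiment: positive\n"
    "Review: 'Terrible acting.' Sentiment: negative\n"
    "Review: 'The plot was dull.' Sentiment:"
)

for prompt in (zero_shot, few_shot):
    output = generator(prompt, max_new_tokens=3, do_sample=False)
    print(output[0]["generated_text"])
```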

Week 3: Evaluating LLMs

  • Understanding the challenges and metrics involved in evaluating LLMs
  • Exploring different evaluation frameworks and benchmarks
  • Hands-on lab: Evaluating LLMs using different metrics and benchmarks
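
As a small taste of metric-based evaluation, the sketch below scores model outputs against references with an exact-match metric. The Hugging Face evaluate library and the toy predictions are illustrative assumptions; the lab may rely on other metrics or on a benchmark harness.

```python
# A minimal evaluation sketch, assuming the Hugging Face `evaluate` library;
# the metric and the toy data are illustrative, not the course's actual benchmark.
import evaluate

exact_match = evaluate.load("exact_match")

predictions = ["Paris", "4", "Helsinki"]
references = ["Paris", "5", "Helsinki"]

result = exact_match.compute(predictions=predictions, references=references)
print(result)  # e.g. {'exact_match': 0.666...}
```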

Week 4: Fine-tuning LLMs

  • Understanding the concept of fine-tuning and its benefits
  • Exploring different fine-tuning techniques and strategies
  • Hands-on lab: Fine-tuning an LLM for a specific NLP task
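
One common parameter-efficient fine-tuning strategy is LoRA; the sketch below wraps a base model with low-rank adapters so that only a small fraction of parameters are trained. The peft library, the gpt2 base model, and the hyperparameters are illustrative assumptions, not the course's prescribed setup.

```python
# A minimal LoRA fine-tuning sketch; model, target modules, and hyperparameters
# are illustrative assumptions only.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "gpt2"  # illustrative base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Wrap the base model with low-rank adapters; only the adapter weights are trainable.
lora_config = LoraConfig(r=8, lora_alpha=16, target_modules=["c_attn"], lora_dropout=0.05)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Training itself would then proceed with e.g. transformers.Trainer on a task dataset.
```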

Week 5: Retrieval Augmented Generation (RAG)

  • Understanding the concept of RAG and its advantages
  • Exploring different RAG architectures and techniques
  • Hands-on lab: Implementing a RAG system for a specific NLP task
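
The sketch below shows the core RAG loop in miniature: embed a small document store, retrieve the passage closest to the question, and place it in the generation prompt as context. The sentence-transformers retriever and the toy documents are illustrative assumptions; a real system would typically use a vector database and pass the assembled prompt to an LLM.

```python
# A minimal RAG sketch: embed documents, retrieve the best match, build the prompt.
from sentence_transformers import SentenceTransformer, util

retriever = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative retriever model

documents = [
    "The course runs in Fall 2024, Period 2.",
    "Final reports are due by 31 December 2024.",
    "Week 5 covers Retrieval Augmented Generation.",
]
doc_embeddings = retriever.encode(documents, convert_to_tensor=True)

question = "When is the final report due?"
query_embedding = retriever.encode(question, convert_to_tensor=True)

# Pick the most similar document by cosine similarity
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best_doc = documents[int(scores.argmax())]

# The retrieved context is then placed in the generation prompt
prompt = f"Context: {best_doc}\n\nQuestion: {question}\nAnswer:"
print(prompt)
```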

Week 6: Use cases and applications of LLMs

  • Exploring various real-world applications of LLMs in NLP
  • Discussing the potential impact of LLMs on different industries
  • Hands-on lab: Querying tables and generating synthetic data
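
As one example of the table-querying use case, the sketch below asks a natural-language question over a small pandas table with a table-question-answering pipeline. The TAPAS model and the toy table are illustrative assumptions; synthetic data generation would instead prompt an LLM to emit new rows in a given schema.

```python
# A minimal table-querying sketch; the TAPAS model and the toy table are
# illustrative assumptions. TAPAS expects all table cells as strings.
import pandas as pd
from transformers import pipeline

table = pd.DataFrame(
    {
        "Week": ["1", "3", "5"],
        "Topic": ["Introduction to LLMs", "Evaluating LLMs", "RAG"],
        "Teacher": ["Aarne", "Jussi", "Dmitry"],
    }
)

tqa = pipeline("table-question-answering", model="google/tapas-base-finetuned-wtq")
print(tqa(table=table, query="Who teaches week 5?"))
```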

Week 7: Final report preparation

  • Students work on their final reports, showcasing their understanding of the labs and the concepts learned.

Group Project submission

  • Final reports must be submitted by 31 December 2024.

Note: This syllabus is subject to change at the discretion of the instructors. Any modifications will be communicated to the students in a timely manner.
