Homemade-GPT (Generative Pre-Trained Transformer)

This repository contains a Python implementation of the Decoder portion of the Transformer architecture introduced in the seminal paper "Attention Is All You Need". Figure 1 shows the complete Transformer architecture, with the Encoder block on the left and the Decoder block on the right.
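The sketch below illustrates the shape of one decoder block of this kind: masked (causal) self-attention followed by a position-wise feed-forward network, each wrapped in a residual connection with layer normalization. It is a minimal PyTorch sketch for orientation only; the class and parameter names (DecoderBlock, embed_dim, num_heads, context_len) are assumptions, not the names used in this repository.

```python
import torch
import torch.nn as nn

class DecoderBlock(nn.Module):
    """Hypothetical decoder block: masked self-attention + feed-forward."""

    def __init__(self, embed_dim: int, num_heads: int, context_len: int, dropout: float = 0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(embed_dim, 4 * embed_dim),
            nn.GELU(),
            nn.Linear(4 * embed_dim, embed_dim),
            nn.Dropout(dropout),
        )
        self.ln1 = nn.LayerNorm(embed_dim)
        self.ln2 = nn.LayerNorm(embed_dim)
        # Causal mask: True entries are positions a token is NOT allowed to attend to,
        # i.e. everything to the right of the current position.
        mask = torch.triu(torch.ones(context_len, context_len, dtype=torch.bool), diagonal=1)
        self.register_buffer("causal_mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        seq_len = x.size(1)
        mask = self.causal_mask[:seq_len, :seq_len]
        # Pre-norm masked self-attention with a residual connection.
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + attn_out
        # Position-wise feed-forward network with a residual connection.
        x = x + self.ff(self.ln2(x))
        return x
```

A GPT-style model stacks several such blocks on top of token and positional embeddings and finishes with a linear projection back to the vocabulary.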

The model is fine-tuned on the Alpaca instruction dataset using the Alpaca prompt style.
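For reference, the Alpaca prompt style wraps each example's instruction, optional input, and response in a fixed template. The sketch below follows the field names of the public Alpaca dataset ("instruction", "input", "output"); the exact formatting helper used in this repository is an assumption.

```python
def format_alpaca_prompt(example: dict) -> str:
    """Format one Alpaca-style example into a single training prompt (illustrative sketch)."""
    has_input = bool(example.get("input"))
    header = (
        "Below is an instruction that describes a task"
        + (", paired with an input that provides further context" if has_input else "")
        + ". Write a response that appropriately completes the request.\n\n"
    )
    prompt = header + f"### Instruction:\n{example['instruction']}\n\n"
    if has_input:
        prompt += f"### Input:\n{example['input']}\n\n"
    prompt += f"### Response:\n{example.get('output', '')}"
    return prompt


# Example usage with a made-up record in the Alpaca dataset's schema.
example = {
    "instruction": "Summarize the following text.",
    "input": "Attention Is All You Need introduced the Transformer architecture.",
    "output": "The paper introduced the Transformer, an attention-based sequence model.",
}
print(format_alpaca_prompt(example))
```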
