# pretraining-BERT

This accompanies the blog post: sidsite.com/posts/bert-from-scratch

See the pretraining code in `pretraining_BERT.ipynb`.
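The core of BERT pretraining is masked language modeling. As a rough sketch (not the notebook's actual code), the standard BERT masking recipe selects about 15% of token positions as prediction targets; of those, 80% are replaced with the `[MASK]` token, 10% with a random token, and 10% are left unchanged:

```python
import random

def mask_tokens(token_ids, mask_id, vocab_size, mask_prob=0.15, seed=0):
    """BERT-style masking sketch: pick ~mask_prob of positions as targets.
    Of the selected positions: 80% -> mask_id, 10% -> random token, 10% unchanged."""
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)  # -100 marks positions ignored by the loss
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok  # predict the original token here
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_id          # replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)  # replace with random token
            # else: leave the token unchanged
    return inputs, labels
```

The model is then trained with cross-entropy loss on the masked positions only (the `-100` labels are skipped, matching the convention used by common training frameworks).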

## Fine-tuning

After pretraining, you can fine-tune the model using the `run_finetuning.sh` script. The parameters here are roughly based on Cramming, but I used different training parameters for two of the tasks. (TODO: add these.)