- arXiv
- ICAART
- ICONIP 2019
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations [Code]
- RobBERT: a Dutch RoBERTa-based Language Model
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
- PoWER-BERT: Accelerating BERT inference for Classification Tasks
- TinyBERT: Distilling BERT for Natural Language Understanding
- NeurIPS 2019
- 2018