Paper: https://arxiv.org/abs/2005.08100
The Conformer paper uses only an LSTM as the decoder
=> Idea: add an ASR Transformer decoder on the decoder side
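A minimal sketch of the idea above: pairing a Conformer-style encoder with an attention-based Transformer decoder instead of the paper's LSTM decoder. This is an illustrative assumption, not the paper's code; the encoder here is a plain `nn.TransformerEncoder` stand-in for a real Conformer block stack (feed-forward + relative-position MHSA + convolution module), and all names and sizes are hypothetical.

```python
import torch
import torch.nn as nn

class EncoderWithTransformerDecoder(nn.Module):
    """Hypothetical sketch: encoder memory -> Transformer decoder -> token logits."""
    def __init__(self, vocab_size=1000, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        # Stand-in for a Conformer encoder stack (not the actual Conformer modules).
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers)
        self.embed = nn.Embedding(vocab_size, d_model)
        # The proposed replacement for the LSTM decoder.
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, feats, tokens):
        memory = self.encoder(feats)                  # (B, T, d_model) acoustic memory
        tgt = self.embed(tokens)                      # (B, U, d_model) target embeddings
        # Causal mask so each target position attends only to earlier tokens.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        dec = self.decoder(tgt, memory, tgt_mask=mask)
        return self.out(dec)                          # (B, U, vocab_size) logits

feats = torch.randn(2, 50, 256)           # e.g. subsampled log-mel features
tokens = torch.randint(0, 1000, (2, 10))  # target token ids
logits = EncoderWithTransformerDecoder()(feats, tokens)
print(logits.shape)  # (2, 10, 1000)
```

In practice the encoder would be an actual Conformer implementation (see the repos below), and the decoder would be trained with teacher forcing plus a cross-entropy (or joint CTC/attention) loss.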
Lite Transformer with Long-Short Range Attention: https://arxiv.org/abs/2004.11886
SpecAugment: https://arxiv.org/abs/1904.08779
Multi-head attention with relative positional encoding (Transformer-XL): https://arxiv.org/abs/1901.02860
Implementation (lucidrains): https://github.com/lucidrains/conformer
Implementation (sooftware): https://github.com/sooftware/conformer
ClovaCall: https://github.com/clovaai/ClovaCall
AI Hub dataset (1,000 hours)