# opus+bt-2021-04-14.zip

* dataset: opus+bt
* model: transformer-align
* source language(s): eng
* target language(s): rus
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download: opus+bt-2021-04-14.zip
* test set translations: opus+bt-2021-04-14.test.txt
* test set scores: opus+bt-2021-04-14.eval.txt

## Benchmarks

| testset | BLEU | chr-F | #sent | #words | BP |
|---------|------|-------|-------|--------|-------|
| newstest2012.eng-rus | 31.2 | 0.588 | 3003 | 64830 | 0.980 |
| newstest2013.eng-rus | 23.2 | 0.518 | 3000 | 58560 | 0.974 |
| newstest2015-enru.eng-rus | 29.1 | 0.581 | 2818 | 55915 | 1.000 |
| newstest2016-enru.eng-rus | 27.4 | 0.565 | 2998 | 62018 | 0.993 |
| newstest2017-enru.eng-rus | 30.8 | 0.592 | 3001 | 60255 | 0.998 |
| newstest2018-enru.eng-rus | 27.4 | 0.572 | 3000 | 61920 | 1.000 |
| newstest2019-enru.eng-rus | 27.1 | 0.540 | 1997 | 48153 | 0.927 |
| Tatoeba-test.eng-rus | 45.6 | 0.652 | 10000 | 66872 | 0.987 |
| tico19-test.eng-rus | 27.5 | 0.556 | 2100 | 55837 | 0.927 |
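## Usage sketch

The card above only names the raw Marian download (opus+bt-2021-04-14.zip). If the checkpoint has been converted for Hugging Face `transformers`, English-to-Russian translation can be run roughly as below; this is a minimal sketch, and the hub ID `Helsinki-NLP/opus-mt-en-ru` is an assumption, not something stated in this card. The Marian tokenizer applies the SentencePiece (spm32k) pre-processing listed above.

```python
# Minimal sketch: translate English to Russian with a converted Marian checkpoint.
from transformers import MarianMTModel, MarianTokenizer

# Assumption: a converted checkpoint is available under this hub ID;
# the card itself only lists the raw download opus+bt-2021-04-14.zip.
model_name = "Helsinki-NLP/opus-mt-en-ru"

tokenizer = MarianTokenizer.from_pretrained(model_name)  # handles SentencePiece (spm32k)
model = MarianMTModel.from_pretrained(model_name)

src_texts = ["The weather is nice today."]
batch = tokenizer(src_texts, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```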