An updated version of wmt17-scripts that uses the Nematus Transformer model


EdinburghNLP/wmt17-transformer-scripts


WMT17 TRANSFORMER SCRIPTS


This is a fork of https://github.com/EdinburghNLP/wmt17-scripts that demonstrates training of Nematus models with a Transformer architecture.

training/scripts/train.sh contains a configuration for training a Transformer-base model.
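For reference, "Transformer-base" refers to the hyperparameter set from Vaswani et al. (2017), "Attention Is All You Need". The sketch below lists the key values; the variable names are illustrative and are not necessarily the flag names Nematus uses in train.sh:

```shell
# Transformer-base hyperparameters (Vaswani et al., 2017). Variable names
# here are illustrative, not the actual Nematus flags used in train.sh.
embedding_size=512    # model/embedding dimension (d_model)
ffn_hidden_size=2048  # position-wise feed-forward inner size (4 * d_model)
enc_depth=6           # encoder layers
dec_depth=6           # decoder layers
num_heads=8           # attention heads per layer
dropout=0.1           # residual/attention dropout rate
label_smoothing=0.1   # label-smoothing epsilon for cross-entropy
```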

Scripts for preprocessing, validation, and evaluation are also provided; they mirror the University of Edinburgh's WMT17 setup, with minor tweaks such as a reduced BPE vocabulary size.

REQUIREMENTS

The scripts depend on external software for preprocessing, training, and evaluation.

Please set the appropriate paths in the 'training/vars' file.
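The 'vars' file is a shell fragment that the other scripts source to locate external tools. A hypothetical sketch of what it might look like; all variable names and paths below are illustrative, so check the actual training/vars file for the names it expects:

```shell
# Hypothetical training/vars: paths to external software, sourced by the
# other training scripts. Names and paths are illustrative only.
nematus_home=$HOME/tools/nematus                # Nematus (model training/translation)
moses_scripts=$HOME/tools/mosesdecoder/scripts  # Moses preprocessing scripts
bpe_scripts=$HOME/tools/subword-nmt             # subword-nmt (BPE segmentation)
```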

USAGE INSTRUCTIONS

For training, follow the instructions in training/README.md.

LICENSE

All scripts in this directory are distributed under the MIT license.
