
Neural-Machine-Translation

Neural Machine Translation by Seq2Seq Model with Attention layer

This is a sample neural machine translation project that converts English to French. Don't expect it to perform as well as Google Translate, since it was trained on a very small dataset. Training on a larger dataset should improve the results considerably.

Overview

  • It is based on a Seq2Seq RNN model with an attention layer.
  • A Seq2Seq model has two main parts: the encoder model and the decoder model.
  • The decoder can be a plain Seq2Seq decoder or one with a custom attention layer, which improves the model's accuracy.
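To make the attention idea concrete, here is a minimal NumPy sketch (not the project's Keras code) of dot-product attention: at each decoding step, every encoder hidden state is scored against the current decoder state, the scores are normalized with a softmax, and the context vector is the resulting weighted sum of encoder states. The function and variable names below are illustrative, not taken from this repo.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_context(encoder_states, decoder_state):
    """Dot-product attention.

    encoder_states: (T, H) hidden state per input time step
    decoder_state:  (H,)   current decoder hidden state
    Returns the context vector (H,) and the attention weights (T,).
    """
    scores = encoder_states @ decoder_state   # one alignment score per time step
    weights = softmax(scores)                 # normalized; weights sum to 1
    context = weights @ encoder_states        # weighted sum of encoder states
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
context, weights = attention_context(enc, dec)
```

The context vector is then concatenated with the decoder input at each step, which is what lets the decoder focus on different source words while generating each target word.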

Dependencies

  • Python 3+
  • Keras with the TensorFlow backend
  • NVIDIA GPU (for training, since the model uses the CuDNNLSTM layer, which is accelerated by NVIDIA's cuDNN library)
  • NumPy

How to use

  1. Fork this repo.
  2. Download the dataset from here.
  3. Download the GloVe word embeddings from here.
  4. Save both the dataset and the GloVe embeddings in the data folder.
  5. If training, adjust utils/config.py as needed.
  6. Use the train.ipynb notebook for training.
  7. If running test predictions only, download the pretrained weights from here and save them in the weights folder.
  8. Use the test-translations.ipynb notebook.
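For step 3, the GloVe file is plain text with one word per line followed by its vector components. A minimal sketch of turning it into an embedding matrix indexed by your tokenizer's word indices might look like this (the function name and the zero-row convention for out-of-vocabulary words are illustrative assumptions, not this repo's exact code):

```python
import numpy as np

def load_glove_matrix(glove_path, word_index, embedding_dim=100):
    """Build an embedding matrix from a GloVe text file.

    word_index maps each word to a positive integer index (as a Keras
    Tokenizer would). Rows for words missing from GloVe stay all-zero.
    """
    vectors = {}
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.split()
            vectors[parts[0]] = np.asarray(parts[1:], dtype="float32")

    # Row 0 is reserved for padding, hence len(word_index) + 1 rows.
    matrix = np.zeros((len(word_index) + 1, embedding_dim), dtype="float32")
    for word, i in word_index.items():
        vec = vectors.get(word)
        if vec is not None:
            matrix[i] = vec
    return matrix
```

The resulting matrix can be passed as the initial weights of a (typically frozen) Keras Embedding layer.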

References

  1. Check out the Deep NLP course by Lazy Programmer Inc.
  2. For the original code, check his repo.
  3. Also check out this article about the Seq2Seq model with an attention layer.
