Implementation of character-level NMT (Neural Machine Translation) using RNN and Attention Wrapper (updated Nov 7, 2017; Python)
Automatic Text Simplification
Korean-English NMT (Neural Machine Translation) with Gluon
nmt-chatbot is an implementation of a chatbot using NMT (Neural Machine Translation, seq2seq). It includes a BPE/WPM-like tokenizer (own implementation). The main purpose of the project is an NMT chatbot, but it remains fully compatible with NMT and can still be used for sentence translation between two languages.
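The BPE/WPM-style tokenization mentioned above can be sketched as a minimal byte-pair-encoding merge loop. This is a hypothetical illustration of the general technique, not the repository's actual implementation; the `learn_bpe_merges` function name and the `</w>` end-of-word marker are assumptions.

```python
from collections import Counter

def learn_bpe_merges(word_freqs, num_merges):
    """Learn byte-pair-encoding merges from a {word: frequency} dict."""
    # Represent each word as a tuple of symbols plus an end-of-word marker.
    vocab = {tuple(w) + ("</w>",): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Replace every occurrence of the best pair with its concatenation.
        merged_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            merged_vocab[tuple(out)] = freq
        vocab = merged_vocab
    return merges

merges = learn_bpe_merges({"low": 5, "lower": 2, "lowest": 2}, num_merges=3)
```

On this toy corpus the first merges collapse the shared prefix: `('l','o')`, then `('lo','w')`, then `('low','</w>')`, so frequent words become single tokens while rare words stay decomposed into subwords.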
Implementation of LSTM and GRU for the English-Vietnamese (IWSLT'15) dataset.
Neural Machine Translation using Attention Mechanism
A sequential encoder-decoder implementation of neural machine translation using Keras
A complete package for a seq2seq NMT (machine translation) model.
CoNLL 2018: Post-OCR Text Correction in Romanised Sanskrit
Neural Machine Translation with attention mechanism
Neural Machine Translation for RDFS reasoning: code and datasets for "Deep learning for noise-tolerant RDFS reasoning" http://www.semantic-web-journal.net/content/deep-learning-noise-tolerant-rdfs-reasoning-4
Using Google Colab, we develop an NMT language translator that translates from English to Vietnamese.
An implementation of the Google NMT model with a detailed explanation
Individual assignment 1 for CS 11-731 Machine Translation course.
Assignment 2 for CS 11-731 Machine Translation course.
A chatbot implemented in TensorFlow based on the seq2seq model, with certain rules integrated.
French-to-English neural machine translation trained on the Multi30k dataset.
Sequence to sequence encoder-decoder model with Attention for Neural Machine Translation
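The attention mechanism named by several of these projects can be sketched as Luong-style dot-product attention: score each encoder state against the current decoder state, softmax the scores into weights, and return the weighted sum as the context vector. A minimal NumPy sketch (the function name and shapes are illustrative assumptions, not any listed repository's API):

```python
import numpy as np

def luong_dot_attention(decoder_state, encoder_states):
    """Dot-product attention over encoder states.

    decoder_state:  (hidden,)        current decoder hidden state
    encoder_states: (src_len, hidden) one state per source position
    Returns (context, weights): context is (hidden,), weights is (src_len,).
    """
    scores = encoder_states @ decoder_state        # (src_len,) alignment scores
    weights = np.exp(scores - scores.max())        # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states             # weighted sum of encoder states
    return context, weights

# Toy example: the decoder state aligns with the third source position.
encoder_states = np.eye(3)
decoder_state = np.array([0.0, 0.0, 10.0])
context, weights = luong_dot_attention(decoder_state, encoder_states)
```

In the toy example the softmax puts almost all weight on the third encoder state, so the context vector is close to that state; in a full model the context is concatenated with the decoder state before predicting the next token.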