BERT MLM Support #208
base: master
Conversation
Just a question?
Great work @Dat-Boi-Arjun. Would you also please submit your Jupyter notebook for training in examples/local?
Description
@AlanAboudib this adds support for a BERT encoder and iterator, specifically for the Masked LM use case.
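For reviewers unfamiliar with the Masked LM setup: BERT's pretraining objective corrupts roughly 15% of input positions and asks the model to recover the originals. The sketch below is not code from this PR; it is a minimal, self-contained illustration of the standard BERT masking rule (80% `[MASK]`, 10% random token, 10% unchanged), with all names chosen for the example.

```python
import random

MASK_TOKEN = "[MASK]"  # illustrative placeholder for the tokenizer's mask token


def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """Apply BERT-style masked-LM corruption to a token list.

    About `mask_prob` of the positions are selected; of those, 80% are
    replaced with [MASK], 10% with a random vocabulary token, and 10%
    are left unchanged. Returns (corrupted_tokens, labels), where
    labels[i] is the original token for selected positions and None
    elsewhere (unselected positions contribute no loss).
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # model must predict the original token here
            roll = rng.random()
            if roll < 0.8:
                corrupted.append(MASK_TOKEN)
            elif roll < 0.9:
                corrupted.append(rng.choice(vocab))
            else:
                corrupted.append(tok)  # kept as-is, but still predicted
        else:
            labels.append(None)
            corrupted.append(tok)
    return corrupted, labels
```

In practice the HuggingFace Transformers library provides this masking for you (e.g. via its language-modeling data collators); the sketch is only meant to make the iterator's job concrete.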
Affected Dependencies
This now requires the HuggingFace Transformers library to be installed.
How has this been tested?
I used a structure very similar to the BPTT example notebook to verify that the encoder and iterator work when training a BERT model. I obtained good validation and test scores for the trained model on the WikiText-2 dataset.
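A note on interpreting those scores: masked-LM quality on WikiText-2 is typically reported as the mean cross-entropy over masked positions, or equivalently its exponential, perplexity. A quick sketch of that conversion (the exact loss values here are made up for illustration):

```python
import math


def perplexity(mean_nll):
    """Perplexity is the exponential of the mean negative log-likelihood
    (cross-entropy in nats) over the masked positions; lower is better."""
    return math.exp(mean_nll)


# A mean masked-LM loss of 0.0 nats means perfect prediction:
assert perplexity(0.0) == 1.0
```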