Implementation of DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
- Three .ipynb files, one for each of the three datasets mentioned in the paper.
- BERT_RTE.py trains on the RTE dataset; CustomInLawBert_RR.py and DistilBert_RR.py train on the rhetorical role prediction dataset (see the sketch after this list).
- The JSON files contain the rhetorical role prediction dataset.
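
A minimal sketch of fine-tuning DistilBERT for rhetorical role classification with the Hugging Face `transformers` library, roughly the kind of loop DistilBert_RR.py would run. It is not the repo's actual script: the file name `rr_train.json`, the `{"text", "label"}` record layout, and the hyperparameters are assumptions for illustration.

```python
# Hedged sketch: fine-tune DistilBERT for rhetorical role classification.
# Assumed (not taken from this repo): a hypothetical "rr_train.json" holding a
# list of {"text": ..., "label": ...} records, with string labels.
import json

import torch
from torch.utils.data import DataLoader, Dataset
from transformers import DistilBertForSequenceClassification, DistilBertTokenizerFast


class RRDataset(Dataset):
    """Tokenizes sentence/label pairs into fixed-length tensors."""

    def __init__(self, records, tokenizer, label2id, max_length=128):
        self.enc = tokenizer(
            [r["text"] for r in records],
            truncation=True, padding="max_length", max_length=max_length,
            return_tensors="pt",
        )
        self.labels = torch.tensor([label2id[r["label"]] for r in records])

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        return {
            "input_ids": self.enc["input_ids"][i],
            "attention_mask": self.enc["attention_mask"][i],
            "labels": self.labels[i],
        }


def main():
    with open("rr_train.json") as f:  # hypothetical dataset path
        records = json.load(f)
    label2id = {lbl: i for i, lbl in enumerate(sorted({r["label"] for r in records}))}

    tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
    model = DistilBertForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", num_labels=len(label2id)
    )

    loader = DataLoader(RRDataset(records, tokenizer, label2id), batch_size=16, shuffle=True)
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device).train()

    for epoch in range(3):
        for batch in loader:
            batch = {k: v.to(device) for k, v in batch.items()}
            loss = model(**batch).loss  # cross-entropy over rhetorical role labels
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
        print(f"epoch {epoch} done, last loss {loss.item():.4f}")


if __name__ == "__main__":
    main()
```

The same structure works for the RTE task by swapping the dataset loader for sentence-pair inputs and setting `num_labels=2`.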