Use Roberta in pytorch transformers #39
No. This roberta_zh is based on BERT, adopting the main ideas from the RoBERTa paper, but it may not be compatible with the RoBERTa PyTorch implementation from the Facebook AI project. So loading it with BertPreTrainedModel is the right way. Can you paste your code here to show how you load roberta_zh using BertPreTrainedModel and how you use it? (You can try the other way anyway, but it should fail.)
It's the same as the original. However, for ...
Yeah, it is.
Can you please suggest which PyTorch BERT package is compatible with this pre-trained model?
This RoBERTa is BERT: you can load it with pytorch_transformers by selecting BERT as the model class.
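Following that advice, here is a minimal loading sketch, assuming the converted checkpoint directory contains bert_config.json, vocab.txt, and pytorch_model.bin; the helper name and file layout are illustrative assumptions, not something stated in this thread:

```python
def load_roberta_zh(model_dir):
    """Illustrative helper: load a roberta_zh checkpoint with the plain
    BERT classes from pytorch_transformers, since its weights and vocab
    follow BERT's format rather than Facebook's RoBERTa layout."""
    from pytorch_transformers import BertConfig, BertModel, BertTokenizer

    # Assumed converted-checkpoint layout; adjust names to your download.
    config = BertConfig.from_json_file(model_dir + "/bert_config.json")
    tokenizer = BertTokenizer(model_dir + "/vocab.txt")
    model = BertModel.from_pretrained(model_dir, config=config)
    return model, tokenizer
```

The key point is that no RoBERTa-specific class is involved anywhere: the checkpoint is consumed exactly like a stock Chinese BERT.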
I have been using BertPreTrainedModel to load this RoBERTa model, which works well. Noticing that Roberta is also supported in pytorch_transformers, should I switch to Roberta? If so, what should I use for the merges_file parameter of RobertaTokenizer?