What is the difference between Roberta_l24_zh_base and RoBERTa-zh-Large? And is there an example of calling it from Keras? Thanks a lot!
The base model has 24 layers, but its hidden size is not enlarged; the large model's hidden size is enlarged accordingly. Keras usage example: https://github.com/bojone/bert4keras
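For reference, here is a minimal loading sketch with bert4keras, assuming a recent version that exposes `build_transformer_model` and `Tokenizer` (older releases used `bert4keras.bert.build_bert_model`); the paths are placeholders for wherever the checkpoint archive is unpacked.

```python
import numpy as np
from bert4keras.models import build_transformer_model
from bert4keras.tokenizers import Tokenizer

# Placeholder paths: point these at the unpacked Roberta_l24_zh_base checkpoint.
config_path = 'roberta_l24_zh_base/bert_config.json'
checkpoint_path = 'roberta_l24_zh_base/bert_model.ckpt'
dict_path = 'roberta_l24_zh_base/vocab.txt'

# The Chinese RoBERTa checkpoints keep the standard BERT architecture,
# so they load as a plain BERT model from a TensorFlow checkpoint.
model = build_transformer_model(config_path, checkpoint_path)
tokenizer = Tokenizer(dict_path, do_lower_case=True)

token_ids, segment_ids = tokenizer.encode('今天天气不错')  # "The weather is nice today"
features = model.predict([np.array([token_ids]), np.array([segment_ids])])
print(features.shape)  # (1, seq_len, hidden_size); hidden_size is 768 for the base config
```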
OK, thanks.
> The base model has 24 layers, but its hidden size is not enlarged; the large model's hidden size is enlarged accordingly.

Does that mean that to load Roberta_l24_zh_base, the hidden size in bert_config_large.json needs to be changed to 768?
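For illustration only, this sketch spells out what the reply above implies about the two configurations, covering just layer count and hidden size; the authoritative values are in the bert_config.json bundled with each checkpoint, which should be used directly rather than hand-editing bert_config_large.json.

```python
# Implied shapes of the two checkpoints (layer count and hidden size only);
# other fields (attention heads, intermediate size) should be read from the
# config file shipped with each checkpoint.
roberta_l24_zh_base = {'num_hidden_layers': 24, 'hidden_size': 768}   # base width, extra depth
roberta_zh_large    = {'num_hidden_layers': 24, 'hidden_size': 1024}  # large width
```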