
What is the difference between Roberta_l24_zh_base and RoBERTa-zh-Large? And is there an example of calling it from Keras? Thanks! #52

Open
lx-rookie opened this issue Nov 14, 2019 · 3 comments

Comments

@lx-rookie

What is the difference between Roberta_l24_zh_base and RoBERTa-zh-Large? And is there an example of calling it from Keras? Thanks!

@brightmart
Owner

The base variant has 24 layers, but the hidden size is not enlarged; the large variant's hidden size is enlarged accordingly.
Keras usage example: https://github.com/bojone/bert4keras

@lx-rookie
Author

Got it, thanks.

@excelsimon

> The base variant has 24 layers, but the hidden size is not enlarged; the large variant's hidden size is enlarged accordingly.

Does that mean that to load Roberta_l24_zh_base, the hidden size in bert_config_large.json needs to be changed to 768?
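For reference, a minimal sketch of what the config difference would look like. The field names follow the standard BERT `bert_config.json` format, and the base/large values shown (768 vs. 1024 hidden size, 12 vs. 16 attention heads, 3072 vs. 4096 intermediate size) are the conventional BERT base/large dimensions; the exact values for each released checkpoint should be verified against the config files shipped with it:

```python
import json

# Hypothetical excerpt of bert_config_large.json (standard BERT config keys;
# verify against the actual released file).
config_large = {
    "num_hidden_layers": 24,
    "hidden_size": 1024,           # large widens the hidden size
    "num_attention_heads": 16,
    "intermediate_size": 4096,
}

# Roberta_l24_zh_base: same 24 layers, but base-sized widths.
config_l24_base = {
    "num_hidden_layers": 24,       # depth matches large
    "hidden_size": 768,            # width stays at base size
    "num_attention_heads": 12,
    "intermediate_size": 3072,
}

# If one were to reuse bert_config_large.json, the width-related fields
# (not just hidden_size) would all need patching before loading:
patched = json.loads(json.dumps(config_large))  # deep copy via JSON round-trip
patched.update(hidden_size=768, num_attention_heads=12, intermediate_size=3072)

assert patched == config_l24_base
```

That said, if the release includes its own config file for the 24-layer base checkpoint, using that file directly is safer than hand-patching the large config.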
