As an RBERT user, I'd like the tokenizer to be as fast as it can be, so that I don't have to wait for this step more than is absolutely necessary.
First thing to check: Does keras::text_tokenizer (and friends) do what we need? If so, we should be able to save_text_tokenizer() when the model is downloaded for #51.
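If keras::text_tokenizer turns out to be suitable, the fit/save/reload flow would look roughly like the sketch below. This is a minimal sketch, not a proposed implementation: the corpus, num_words value, and file name are placeholders, and note that the keras tokenizer is word-level rather than WordPiece, so whether it "does what we need" is exactly the open question.

```r
library(keras)

# Placeholder corpus; the real vocabulary/model comes from the download in #51.
texts <- c(
  "RBERT tokenizes text for BERT models",
  "the tokenizer should be as fast as possible"
)

# Build and fit a keras tokenizer (word-level splitting, not WordPiece).
tokenizer <- text_tokenizer(num_words = 30000, lower = TRUE)
tokenizer %>% fit_text_tokenizer(texts)

# Convert text to integer id sequences to sanity-check the output.
sequences <- texts_to_sequences(tokenizer, texts)

# Persist the fitted tokenizer alongside the downloaded model so it can be
# reloaded later without refitting ("bert_tokenizer" is an assumed file name).
save_text_tokenizer(tokenizer, "bert_tokenizer")
tokenizer_reloaded <- load_text_tokenizer("bert_tokenizer")
```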