"ValueError: bytes object is too large" when using to_disk on large model. #6875
Labels: `feat / serialize` (Feature: Serialization, saving and loading), `feat / transformer` (Feature: Transformer), `v2` (spaCy v2.x)
Hi,
I'm attempting to initialize the gpt2-xl Hugging Face model in spaCy using the following code provided in examples/init_model.py:
After downloading the model, the call nlp.to_disk(path) raises an exception:
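For context: this error message matches the one msgpack (used under the hood by spaCy's serializer, srsly) raises when a single `bytes` payload exceeds the 32-bit length limit of its `bin` type, roughly 4 GiB, and a gpt2-xl checkpoint is larger than that. The sketch below is illustrative only, not spaCy's or msgpack's actual code; the helper name and the ~6 GiB size estimate are assumptions.

```python
# Sketch: msgpack's "bin" type stores the payload length in a 32-bit
# field, so a bytes object larger than 2**32 - 1 bytes cannot be packed.
# This helper is hypothetical and only mirrors the failing size check.

MSGPACK_MAX_BIN_LEN = 2**32 - 1  # 32-bit length field, ~4 GiB


def check_packable(payload_size: int) -> None:
    """Raise the same ValueError seen in the report for oversized payloads."""
    if payload_size > MSGPACK_MAX_BIN_LEN:
        raise ValueError("bytes object is too large")


# A gpt2-xl checkpoint is several GiB of weights (rough estimate),
# well past the limit, so serializing it as one blob fails:
gpt2_xl_approx_bytes = 6 * 1024**3
try:
    check_packable(gpt2_xl_approx_bytes)
except ValueError as e:
    print(e)  # bytes object is too large
```

If this is indeed the cause, the model's weights would need to be written in chunks (or via a different serializer) rather than as a single msgpack `bin` payload.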