PyTorchWrapper causes error on deserializing #8319
Labels: bug, feat / serialize, feat / transformer, 🔮 thinc
How to reproduce the behaviour
Add a PyTorchWrapper component to a language pipeline and then do this:
The motivating case is covered in #8291. This issue touches code in Thinc and spacy-transformers.
The issue is that the model has fewer layers at construction time than after `initialize` is called. When deserializing, Thinc detects this mismatch and throws an error.
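The mechanism can be illustrated with a toy model. This is a minimal sketch, not the real Thinc API: the class, method names, and the exact check are hypothetical stand-ins. The point is that `initialize` adds sublayers that did not exist at construction time, so a strict layer-count check in `from_bytes` fails on a freshly constructed model.

```python
import pickle


class ToyModel:
    """Hypothetical stand-in for a Thinc model whose sublayers
    are created lazily when initialize() runs."""

    def __init__(self, name):
        self.name = name
        self.layers = []  # constructed with no sublayers

    def initialize(self):
        # Sublayers appear only here, so the layer count after
        # initialize() differs from the count at construction time.
        self.layers = [ToyModel("sublayer1"), ToyModel("sublayer2")]

    def to_bytes(self):
        return pickle.dumps([layer.name for layer in self.layers])

    def from_bytes(self, data):
        names = pickle.loads(data)
        # Mimics the strict check: a freshly constructed model has
        # fewer layers than the serialized one, so this raises.
        if len(names) != len(self.layers):
            raise ValueError(
                f"Cannot deserialize: model has {len(self.layers)} "
                f"layers, serialized data has {len(names)}"
            )
        for layer, name in zip(self.layers, names):
            layer.name = name
        return self


saved = ToyModel("tok2vec")
saved.initialize()          # sublayers are added here
data = saved.to_bytes()

fresh = ToyModel("tok2vec")  # constructed but not initialized
try:
    fresh.from_bytes(data)   # fails: 0 layers vs. 2 in the data
except ValueError as err:
    print(err)
```

Calling `initialize` on the fresh model before `from_bytes` makes the layer counts match and the round trip succeed, which is why the discrepancy only surfaces at deserialization time.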