Does Floret support Fasttext trained models? #11866
-
floret can't currently load fastText-trained models. The actual differences are very small, just two added settings in the arguments, so it might be possible to make it backwards-compatible without a lot of effort. The direct changes are in the argument serialization, but I think it might be hard to auto-detect the format. It might be easier to write a short program that loads a model as fasttext, adds the default values for these two fields, and writes it back out for floret.
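To make that conversion idea concrete, here is a minimal sketch of one possible variant: rather than going through the libraries as suggested above, it patches the serialized args block of a fastText `.bin` file directly and inserts defaults for the two floret-specific settings. Everything on the floret side is an assumption to verify against floret's sources (the names `mode`/`hashCount`, their position right after the standard fastText args, their int32 encoding, and the default values); this is an illustrative sketch, not a tested converter.

```python
# patch_fasttext_to_floret.py -- illustrative sketch only, not a tested converter.
#
# Assumptions to verify against the floret sources before trusting the output:
#   * the input is a *versioned* fastText .bin: int32 magic, int32 version, then
#     the Args block (12 int32-sized fields followed by one float64 field);
#   * floret's two extra settings (mode, hashCount) are serialized as two int32
#     values appended immediately after that Args block;
#   * mode=0 / hash_count=1 are the fastText-compatible defaults;
#   * the model was written on a little-endian machine (typical x86 builds).
import struct
import sys

FASTTEXT_MAGIC = 793712314  # FASTTEXT_FILEFORMAT_MAGIC_INT32

def patch(src_path: str, dst_path: str, mode: int = 0, hash_count: int = 1) -> None:
    with open(src_path, "rb") as f:
        data = f.read()

    magic, _version = struct.unpack_from("<ii", data, 0)
    if magic != FASTTEXT_MAGIC:
        raise ValueError("not a versioned fastText .bin model")

    # 8-byte header + 12 * int32 + 1 * float64 = end of the standard Args block
    args_end = 8 + 12 * 4 + 8

    with open(dst_path, "wb") as f:
        f.write(data[:args_end])
        # insert the two floret-only fields with (assumed) fastText-compatible defaults
        f.write(struct.pack("<ii", mode, hash_count))
        # copy the rest of the model (dictionary, matrices, ...) unchanged
        f.write(data[args_end:])

if __name__ == "__main__":
    patch(sys.argv[1], sys.argv[2])
```

Trying the patched file on a small test model with floret (or spaCy's floret bindings) would quickly confirm whether these layout assumptions hold for your floret version.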
-
Interestingly, we typically use quantized versions of fastText models in production, i.e. *.ftz files. Would these also work? I noticed that the documentation suggests you can quantize floret-based models?
-
The floret documentation seems to suggest that it's compatible with fastText, i.e. you can train with `-mode fasttext`. However, if I try to load a model trained with fastText itself, floret hangs and spins the CPU to 100%. Does this mean that although floret can produce a fastText-style model, it can only load models trained with floret?
I ask because we have lots of existing fastText-trained models in production and would love to adopt floret, but we need to be able to load the existing models rather than re-train everything. Obviously we could run fastText and floret in parallel, but it's not ideal.
We're testing with the command-line version to avoid any Python issues.