Limiting Memory Consumption #13073
thatcort started this conversation in Help: Best practices
Replies: 1 comment
-
Hi, we're trying to use spaCy for sentence segmentation (and ideally NER at the same time), but we're running into GPU memory issues. The amount of GPU memory used seems to grow with the number of sentences in the documents, which is causing OOM errors and making other tasks on the GPU fail or not run. We're trying to parameterize this to work in a Ray cluster, where we allocate a fixed amount of memory to the sentence segmentation task in advance (we can't know how many sentences there will be before running!).
Is there a way to limit the amount of GPU memory that spaCy will use?
Thank you!
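Roughly, our setup looks like the sketch below (the model name, batch size, and Ray GPU fraction are simplified placeholders, not our real values):

```python
# Simplified sketch of the setup described above; model name, batch size,
# and num_gpus are placeholders.
import ray
import spacy

@ray.remote(num_gpus=0.25)  # fixed share of a GPU reserved for this task
def segment(texts):
    spacy.require_gpu()                  # run the pipeline on the GPU
    nlp = spacy.load("en_core_web_trf")  # pipeline with parser/senter and NER
    results = []
    # Smaller batches lower the peak allocation somewhat, but memory use
    # still grows with the number of sentences in the documents.
    for doc in nlp.pipe(texts, batch_size=32):
        sents = [s.text for s in doc.sents]
        ents = [(e.text, e.label_) for e in doc.ents]
        results.append((sents, ents))
    return results
```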
-
Depending on the type of model, spaCy uses either CuPy or PyTorch to allocate GPU memory, so you can control memory usage through the CuPy or PyTorch options: https://docs.cupy.dev/en/stable/user_guide/memory.html By default, the …
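As a minimal sketch, you can set a limit before loading the pipeline; the sizes below are arbitrary examples, and which limit actually applies depends on whether the loaded pipeline allocates through CuPy or through PyTorch:

```python
# Cap GPU memory before loading the spaCy pipeline.
# The 2 GiB / 0.5 values are arbitrary examples, not recommendations.
import spacy

spacy.require_gpu()

# CuPy-backed pipelines: cap the default memory pool
# (an equivalent option is the CUPY_GPU_MEMORY_LIMIT environment variable).
import cupy
cupy.get_default_memory_pool().set_limit(size=2 * 1024**3)  # 2 GiB

# PyTorch-backed (e.g. transformer) pipelines: cap this process's share of the device.
import torch
torch.cuda.set_per_process_memory_fraction(0.5, device=0)  # 50% of device 0

nlp = spacy.load("en_core_web_trf")
```

In practice you usually only need one of the two settings, since a given pipeline allocates through either CuPy or PyTorch, not both.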