I created my pytorch-lightning DataModule this way (as in the TorchIO + MONAI + PyTorch Lightning notebook):
But I get the following Lightning warning about the returned DataLoaders:
And if I add
So, I guess I should ignore the Lightning warning about the DataLoaders. My understanding is that torchio does the job of loading and processing the patches in parallel, so the DataLoader itself does not need multiple workers and batches are ready to be consumed. Is this correct? If so, can I safely silence the Lightning warning with:
as advised here?
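The snippet the poster refers to is not preserved in the thread. One common approach (an assumption on my part, not necessarily what the linked advice said) is to filter the warning by its message so that other warnings stay visible:

```python
# Sketch: suppress only the "does not have many workers" warning
# that Lightning emits for DataLoaders with num_workers=0.
# The message pattern is matched against the start of the warning text.
import warnings

warnings.filterwarnings(
    "ignore",
    message=".*does not have many workers.*",
)
```

Filtering by message substring is narrower than ignoring the whole warning category, so genuinely useful warnings from Lightning are not hidden.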
Replies: 1 comment
Ok I just found in the documentation:
So I have my answer, thanks!