Use torch.inference_mode() for lower memory usage during calibration #42
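A minimal sketch of the change this PR describes: wrapping the calibration loop in `torch.inference_mode()` instead of `torch.no_grad()` so autograd tracking and version counters are skipped entirely, which reduces peak memory while collecting activation statistics. The `calibrate` function, `data_loader`, and `num_batches` names below are illustrative, not taken from the PR.

```python
import torch

def calibrate(model: torch.nn.Module, data_loader, num_batches: int = 32) -> None:
    # Hypothetical calibration loop: feed a few batches through the model so
    # attached observers/hooks can record activation ranges.
    model.eval()
    # inference_mode() disables autograd tracking entirely (no grad buffers,
    # no version counters), so it uses less memory than torch.no_grad().
    with torch.inference_mode():
        for i, (inputs, _) in enumerate(data_loader):
            if i >= num_batches:
                break
            model(inputs)
```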
| Job | Run time |
| --- | --- |
|  | 4m 5s |
|  | 2m 2s |
|  | 1m 58s |
|  | 2m 7s |
|  | 10m 12s |