Test performance discrepancy investigation #7421
-
Hi! I've noticed something odd in the time reporting when running tests with the same Vitest configuration in a CI/CD job with docker-in-docker (dind) versus without it, on the same hardware. While the individual step times are longer with dind, the reported total duration is paradoxically shorter. Here are the measurements:

Without docker-in-docker (sum of steps ~87s)
With docker-in-docker (sum of steps ~126s)

Interestingly, the reported total duration is ~50s faster when executing tests with docker-in-docker. This matches our CI/CD job completion times, so Vitest's total duration calculation appears accurate 👍 I understand that with parallel execution the individual steps won't necessarily sum to the total duration, but I'm puzzled by consistently slower individual steps in dind combined with a significantly faster total runtime.

This suggests there are additional operations happening in the non-dind run that aren't captured as separate timing steps but still consume a lot of time, perhaps related to generating test artifacts like coverage reports, or a process being left hanging that blocks completion of the run (both measurements generated the same artifacts, though, and there were no warnings about hanging processes). I was also wondering whether dind implicitly picks up higher parallelism from host machine settings that differ between the dind and non-dind environments 🤔

I'd appreciate any guidance on how to get deeper insight into where Vitest is spending its time. Are there custom reporters, profiling options, or other tools you'd recommend for investigating this?
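For context, the kind of thing I had in mind is a custom reporter along these lines — a minimal sketch, where the file and class names are mine, and the `Reporter` type import path may differ between Vitest versions:

```ts
// wall-clock-reporter.ts — a minimal sketch of a custom reporter that logs
// wall-clock time between run start and finish, so it can be compared with
// the per-step durations Vitest prints. onInit/onFinished are hooks from
// Vitest's Reporter interface; everything else here is illustrative.
import type { Reporter } from 'vitest/node'

export default class WallClockReporter implements Reporter {
  private start = 0

  onInit() {
    // Record when Vitest initializes the run.
    this.start = performance.now()
  }

  onFinished() {
    // Log total elapsed wall-clock time once all test files have finished.
    const elapsedSec = (performance.now() - this.start) / 1000
    console.log(`[wall-clock] full run took ${elapsedSec.toFixed(1)}s`)
  }
}

// vitest.config.ts — register the custom reporter alongside the default one
import { defineConfig } from 'vitest/config'
import WallClockReporter from './wall-clock-reporter'

export default defineConfig({
  test: {
    reporters: ['default', new WallClockReporter()],
  },
})
```

Comparing this reporter's output between the dind and non-dind jobs would at least show whether the missing time falls inside Vitest's run or outside it (e.g. in job setup/teardown).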
-
Is there any difference in test run times when running with `--no-file-parallelism`? That disables parallelism completely.
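For completeness, the config-level equivalent is a sketch like the following, assuming a standard `vitest.config.ts`:

```ts
// vitest.config.ts — config equivalent of the --no-file-parallelism CLI flag
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    // Run test files one at a time instead of across parallel workers.
    // If the dind vs. non-dind gap disappears with this set, the difference
    // likely comes from host-dependent parallelism (e.g. the CPU count
    // visible inside each environment).
    fileParallelism: false,
  },
})
```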