Using system generated seed in RandomSampler #1441
Conversation
Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/pytorch/data/1441. Note: links to docs will display an error until the docs builds have been completed. No failures as of commit a5ec001 with merge base f15fd3a. (This status was generated by Dr. CI and updates every 15 minutes.)
seed = 1
torch.manual_seed(seed)
dl3 = StatefulDataLoader(self.dataset, batch_size=1, shuffle=True)
data_dl3 = []
Can we call this results3? And similarly results1 and results2 above.
seed = 1
torch.manual_seed(seed)
dl3 = StatefulDataLoader(self.dataset, batch_size=1, shuffle=True)
We can rename dl3 to dataloader3. Ditto for the other dataloader variables.
)

def test_seed_replicability(self):
    seed = 0
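As a rough sketch of what such a replicability test can check (using the plain torch.utils.data DataLoader and RandomSampler rather than StatefulDataLoader, and a made-up run_epoch helper):

```python
import torch
from torch.utils.data import DataLoader, RandomSampler

def run_epoch(seed, dataset):
    # Seed the global RNG; RandomSampler with generator=None derives its
    # seed from the global RNG, so the shuffle order becomes reproducible.
    torch.manual_seed(seed)
    loader = DataLoader(dataset, batch_size=1, sampler=RandomSampler(dataset))
    return [batch.item() for batch in loader]

dataset = list(range(10))
seed = int(torch.empty((), dtype=torch.int64).random_().item())
assert run_epoch(seed, dataset) == run_epoch(seed, dataset)
```

The same shape of test applies to StatefulDataLoader with shuffle=True, which is what the diff above exercises.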
Instead of checking for the specific seeds 0 and 1, we can generalize the test to two randomly generated seeds, and also add an assert to ensure the two seeds are not equal.
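One way to implement that suggestion (the helper name is made up for illustration):

```python
import torch

def two_random_seeds():
    # Draw two seeds from torch's global RNG, retrying until they differ,
    # so the test never accidentally compares two identical seeds.
    while True:
        seed1 = int(torch.empty((), dtype=torch.int64).random_().item())
        seed2 = int(torch.empty((), dtype=torch.int64).random_().item())
        if seed1 != seed2:
            return seed1, seed2

s1, s2 = two_random_seeds()
assert s1 != s2
```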
LGTM!
data_source: Sized,
replacement: bool = False,
num_samples: Optional[int] = None,
generator=None,
):
    if generator is None:
        # Ensure that underlying sampler has something repeatable
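A minimal sketch of the fallback being discussed (the class name is hypothetical; the seed derivation matches what torch.utils.data.sampler.RandomSampler does):

```python
import torch
from typing import Optional, Sized

class SketchRandomSampler:
    def __init__(
        self,
        data_source: Sized,
        replacement: bool = False,
        num_samples: Optional[int] = None,
        generator=None,
    ):
        if generator is None:
            # Derive the seed from the global RNG instead of a fixed
            # constant, so the sampler responds to torch.manual_seed().
            seed = int(torch.empty((), dtype=torch.int64).random_().item())
            generator = torch.Generator()
            generator.manual_seed(seed)
        self.data_source = data_source
        self.replacement = replacement
        self.num_samples = num_samples
        self.generator = generator
```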
Let's remove or update this comment.
Update/remove comment and then gogogo
* add new sampler tests
* update seed generation in sampler
* run precommit
* update seed generation
* change variable name
* update comment
* add seed to tests
* run precommit

* Fix end of epoch StatefulDataLoader restart (#1439)
* add test for end of epoch state dict check
* run precommit; update stateful_dataloader; run precommit; local changes; update test to test the order of batches; update test; update tests; revert changes in SDL; revert changes in SDL; update tests; run precommit
* update sampler
* run precommit
* remove unnecessary comment
* add test for statedict before and after endofepoch
* run precommit
* check if _sampler_iter is exhausted
* run precommit
* remove commented lines
* remove default values
* only exhaust sampler_iter if present in sd
* update _StatefulRandomSamplerIterator; update state dict if the iterator has finished; add comment about why we're updating state dict; run precommit
* update randomsampleriter state_dict fully
* run precommit
* fork torch.utils.data RandomSampler; reverse changes to sdl.py; generator to iterator; run precommit; update generator usage
* update class name
* run precommit
* add a method to generate permutations
* update return type
* update next logic
* add comment
* update tests to include non stateful samplers
* add comments
* Using system generated seed in RandomSampler (#1441)
* add new sampler tests
* update seed generation in sampler
* run precommit
* update seed generation
* change variable name
* update comment
* add seed to tests
* run precommit
Currently we fix the seed for the generator in RandomSampler as 1. This leads to the generator not changing even when the torch.manual_seed() seed is changed. The RandomSampler in torch.utils.data.sampler uses seed = int(torch.empty((), dtype=torch.int64).random_().item()); this PR uses the same approach here.

Fixes #1440
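The behaviour this change restores can be sketched directly (the helper name is illustrative, not part of the PR):

```python
import torch

def system_seed() -> int:
    # Same pattern as torch.utils.data.sampler.RandomSampler:
    # draw a fresh int64 seed from torch's global RNG, so the value
    # tracks whatever torch.manual_seed() set.
    return int(torch.empty((), dtype=torch.int64).random_().item())

torch.manual_seed(0)
a = system_seed()
torch.manual_seed(0)
b = system_seed()
torch.manual_seed(1)
c = system_seed()
assert a == b  # deterministic under the same global seed
assert a != c  # changes when torch.manual_seed() changes
```

With a fixed seed of 1, a and c above would be identical, which is exactly the bug reported in #1440.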