"many batches" -> "mini-batches" (fastai#402)
kerrickstaley authored Apr 25, 2022
1 parent 2f010aa commit 150e224
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions 04_mnist_basics.ipynb
@@ -4180,7 +4180,7 @@
"\n",
"As we saw in our discussion of data augmentation in <<chapter_production>>, we get better generalization if we can vary things during training. One simple and effective thing we can vary is what data items we put in each mini-batch. Rather than simply enumerating our dataset in order for every epoch, instead what we normally do is randomly shuffle it on every epoch, before we create mini-batches. PyTorch and fastai provide a class that will do the shuffling and mini-batch collation for you, called `DataLoader`.\n",
"\n",
"A `DataLoader` can take any Python collection and turn it into an iterator over many batches, like so:"
"A `DataLoader` can take any Python collection and turn it into an iterator over mini-batches, like so:"
]
},
{
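The changed cell describes how `DataLoader` wraps any Python collection, shuffles it, and yields mini-batches. As a rough sketch of that behavior (using PyTorch's `DataLoader`; the names `coll` and `dl` are illustrative, not the notebook's exact cell):

    from torch.utils.data import DataLoader

    coll = range(15)                                   # any Python collection
    dl = DataLoader(coll, batch_size=5, shuffle=True)  # shuffles, then collates into mini-batches
    for batch in dl:
        print(batch)  # each batch is a tensor of 5 items, e.g. tensor([ 3, 12,  8, 10,  2])

Because `shuffle=True`, each pass over `dl` draws the items in a fresh random order, which is the per-epoch shuffling the cell refers to.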
@@ -4239,7 +4239,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"When we pass a `Dataset` to a `DataLoader` we will get back many batches which are themselves tuples of tensors representing batches of independent and dependent variables:"
"When we pass a `Dataset` to a `DataLoader` we will get back mini-batches which are themselves tuples of tensors representing batches of independent and dependent variables:"
]
},
{
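The second changed cell says a `Dataset` fed to a `DataLoader` comes back as mini-batches pairing a tensor of independent variables with a tensor of dependent variables. A minimal sketch with hypothetical data (again using PyTorch's `DataLoader`):

    import torch
    from torch.utils.data import DataLoader

    xs = torch.arange(10).float().view(-1, 1)  # independent variable
    ys = torch.arange(10) * 2                  # dependent variable
    dset = list(zip(xs, ys))                   # a simple Dataset: a list of (x, y) tuples
    dl = DataLoader(dset, batch_size=4, shuffle=True)
    xb, yb = next(iter(dl))                    # one mini-batch: (x batch, y batch)
    print(xb.shape, yb.shape)                  # torch.Size([4, 1]) torch.Size([4])

The default collation stacks the individual `(x, y)` pairs along a new leading batch dimension, which is why each mini-batch unpacks into one tensor per variable.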
