Welcome to this repository of PyTorch tutorials! The tutorials are quite general and assume no prior knowledge of PyTorch or deep learning, so they can help anyone. They were originally designed for our students in INF265: Deep learning, a course run by the University of Bergen (Norway) as part of the machine learning specialisation of the master's programme in informatics offered by the Department of Informatics. I was fully responsible for the practical part of INF265, and the entirety of that practical part is available in my corresponding repository.
- PyTorch datasets and transforms
- The CIFAR dataset
  - Loading the CIFAR dataset in PyTorch
  - Getting started with the CIFAR dataset
  - Plotting images
  - Counting how many samples there are for each class
- Transforms
  - Converting an image into a PyTorch-friendly object
  - Including the preprocessing step (transform operator) when loading the dataset
  - Normalizing the dataset
- Define, train and evaluate a basic neural network in PyTorch
- Loading data
- Loading CIFAR-10 (see previous tutorial)
- From CIFAR-10 to CIFAR-2
- Basic building blocks for neural networks in PyTorch
- The 'torch.nn' module and the 'torch.nn.Module' class
- Our network as a nn.Sequential object
- PyTorch notations and dimensions
- Inspecting a module object
- Training our model
- Training on CPU
- Training on GPU
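The workflow this tutorial covers (a `nn.Sequential` model trained on CIFAR-2, on CPU or GPU) can be condensed into a minimal sketch. The layer sizes, learning rate, and the dummy batch below are illustrative assumptions, not the tutorial's actual values.

```python
import torch
import torch.nn as nn

# Hypothetical CIFAR-2 setup: 3x32x32 images, two classes.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 128),
    nn.ReLU(),
    nn.Linear(128, 2),
)

# Train on GPU when available, otherwise on CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

# Dummy batch standing in for a DataLoader over CIFAR-2.
x = torch.randn(8, 3, 32, 32, device=device)
y = torch.randint(0, 2, (8,), device=device)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

for epoch in range(3):        # a real run would loop over all batches per epoch
    optimizer.zero_grad()     # reset accumulated gradients
    loss = loss_fn(model(x), y)
    loss.backward()           # backpropagate
    optimizer.step()          # update weights
```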
- Define a custom deep neural network in PyTorch
- Loading data, training and measuring accuracy (see previous tutorial)
- Define a simple custom neural network
- Naive (but totally ok) method
- Using the functional API
- Train our custom network (as any other model)
- Measuring accuracy (as any other model)
- Going deeper: defining blocks of layers
- Using nn.Sequential
- Using a subclass of nn.Module
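The two ideas listed above, subclassing `nn.Module` and using the functional API, can be sketched together in a small network. The architecture and sizes here are hypothetical, chosen only to keep the example self-contained.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CustomNet(nn.Module):
    """A small custom network: layers with parameters are registered in
    __init__, while parameter-free operations use the functional API."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(3 * 32 * 32, 64)
        self.fc2 = nn.Linear(64, 2)

    def forward(self, x):
        x = x.view(x.size(0), -1)   # flatten each image in the batch
        x = F.relu(self.fc1(x))     # functional API instead of an nn.ReLU module
        return self.fc2(x)

net = CustomNet()
out = net(torch.randn(4, 3, 32, 32))
print(out.shape)  # torch.Size([4, 2])
```

Because `CustomNet` is an `nn.Module`, it can be trained and evaluated exactly like any other model, as the tutorial notes.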
- Computational graph
- MiniNet
- PyTorch's computational graph
- Weight and gradient values
- Updating weights
- Updating gradients
- Good to know
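The mechanics above (the computational graph, reading weight and gradient values, and updating both) can be seen on a one-parameter toy model; the values here are arbitrary.

```python
import torch

# One-weight "MiniNet", small enough to inspect the graph by hand.
w = torch.tensor(2.0, requires_grad=True)
x = torch.tensor(3.0)

loss = (w * x - 1.0) ** 2   # forward pass builds the computational graph
loss.backward()             # backward pass populates w.grad

# Analytically: d(loss)/dw = 2 * (w*x - 1) * x = 2 * 5 * 3 = 30
print(w.grad)               # tensor(30.)

# Weight updates must not be tracked by autograd:
with torch.no_grad():
    w -= 0.01 * w.grad      # manual gradient-descent step: 2.0 - 0.3 = 1.7

# Gradients accumulate across backward() calls, so reset them explicitly:
w.grad.zero_()
```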
- Input and output shapes of convolutional neural network layers
- A specific example
- Loading data
- Defining a convolutional neural network
- Figuring out shapes when stacking layers
- Shape of the data
- 1st layer: Convolution, nn.Conv2d
- Intermediate layer: MaxPool, nn.MaxPool2d
- 2nd layer: Convolution again
- 2nd Intermediate layer: MaxPool, nn.MaxPool2d
- 3rd layer: Fully connected layer nn.Linear
- 4th layer: Fully connected layer nn.Linear
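The shape bookkeeping described above follows one formula: for input size n, kernel size k, stride s and padding p, the output size is floor((n + 2p - k) / s) + 1. The stack below is a hypothetical example on a 3x32x32 input, not necessarily the tutorial's exact architecture.

```python
import torch
import torch.nn as nn

def out_size(n, k, s=1, p=0):
    """Spatial output size of a conv or pooling layer."""
    return (n + 2 * p - k) // s + 1

# Hypothetical stack on a single 3x32x32 image:
x = torch.randn(1, 3, 32, 32)
x = nn.Conv2d(3, 16, kernel_size=3, padding=1)(x)  # 32 -> out_size(32, 3, 1, 1) = 32
x = nn.MaxPool2d(2)(x)                             # 32 -> 16
x = nn.Conv2d(16, 8, kernel_size=3, padding=1)(x)  # 16 -> 16
x = nn.MaxPool2d(2)(x)                             # 16 -> 8
x = x.view(x.size(0), -1)                          # flatten: 8 * 8 * 8 = 512 features
x = nn.Linear(512, 32)(x)                          # 3rd layer: fully connected
x = nn.Linear(32, 2)(x)                            # 4th layer: fully connected
print(x.shape)                                     # torch.Size([1, 2])
```

Working through `out_size` layer by layer like this is exactly how the in-features of the first `nn.Linear` (here 512) is determined.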