Showing 269 changed files with 71,668 additions and 3,315 deletions.
New file: repository ignore rules (presumably `.gitignore`, inferred from the ignore-rule syntax):
@@ -0,0 +1,10 @@

```
# exclude data directory
*
!misc
!param
!src/*
src/*__pycache__
!sandbox
sandbox/*__pycache__
!paper
paper/*__pycache__
```
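This whitelist-style file ignores everything at the top level except the listed directories. One way to sanity-check such rules is `git check-ignore`; the paths below are illustrative:

```bash
# -v prints the rule that matched a path; exit status 1 means "not ignored"
git check-ignore -v data/some_dataset.bin
git check-ignore -v src/torchprune/setup.py
```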
New file: flake8 configuration (presumably `.flake8`):
@@ -0,0 +1,5 @@

```ini
[flake8]
max-line-length = 79
...
select = C,E,F,W,B,B950
ignore = E203, E501, W503
```
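With this file at the repository root, a plain flake8 invocation picks the settings up automatically; for instance (target path illustrative):

```bash
pip install flake8
flake8 src/   # applies max-line-length = 79 plus the select/ignore lists above
```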
New file: pre-commit configuration (`.pre-commit-config.yaml`):
@@ -0,0 +1,20 @@

```yaml
exclude: sandbox/.*|paper/.*|src/torchprune/torchprune/util/models/cnn/.*|src/torchprune/torchprune/method/messi/util/.*
repos:
  - repo: https://github.com/psf/black
    rev: 20.8b1
    hooks:
      - id: black
        args: ["--line-length", "79"]
  - repo: https://gitlab.com/PyCQA/flake8
    rev: 3.8.4
    hooks:
      - id: flake8
  - repo: https://github.com/PyCQA/pydocstyle
    rev: 5.1.1
    hooks:
      - id: pydocstyle
  - repo: https://github.com/PyCQA/pylint
    rev: pylint-2.6.0
    hooks:
      - id: pylint
        args: ["--disable=C0330,C0302,E0401,R,W", "--good-names=to,i,j,k,g,x"]
```
Modified file: `README.md`:
@@ -1,137 +1,137 @@

# Neural Network Pruning
[Lucas Liebenwein](http://www.mit.edu/~lucasl/),
[Cenk Baykal](http://www.mit.edu/~baykal/),
[Igor Gilitschenski](https://www.gilitschenski.org/igor/),
[Harry Lang](https://www.csail.mit.edu/person/harry-lang),
[Dan Feldman](http://people.csail.mit.edu/dannyf/),
[Daniela Rus](http://danielarus.csail.mit.edu/)

<p align="center">
  <img src="./misc/imgs/pruning_pipeline.png" width="100%">
</p>

Implementation of provable pruning using sensitivity as introduced in
[SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks](https://arxiv.org/abs/1910.05422)
(weight pruning) and
[Provable Filter Pruning for Efficient Neural Networks](https://arxiv.org/abs/1911.07412)
(filter pruning). These algorithms rely on a notion of sensitivity (the
product of the data and the weight) to provably quantify the error introduced
by pruning.

## Methods

### [SiPPing Neural Networks](https://arxiv.org/abs/1910.05422) (Weight Pruning)
<p align="center">
  <img src="./misc/imgs/sipp.png" width="100%">
</p>

### [Provable Filter Pruning](https://arxiv.org/abs/1911.07412) (Filter Pruning)
<p align="center">
  <img src="./misc/imgs/pfp.png" width="100%">
</p>

### Sensitivity of a weight
These algorithms rely on a novel notion of weight sensitivity as a saliency
score for weight parameters in the network to estimate their relative
importance. In the simple case of a linear layer, the sensitivity of a single
weight `w_ij` in layer `l` can be defined as the maximum relative contribution
of the weight to the corresponding output neuron over a small set of points
`x \in S`:

<p align="center">
  <img src="./misc/imgs/sensitivity.png" width="30%">
</p>

The weight hereby represents the edge connecting neuron `j` in layer `l-1` to
neuron `i` in layer `l`. This notion can then be generalized to convolutional
layers, neurons, and filters, among others, as shown in the respective papers.

In the papers, we show how pruning according to (empirical) sensitivity
enables us to provably quantify the trade-off between the error and the
sparsity of the resulting pruned neural network.
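As a concrete illustration, here is a minimal, self-contained sketch of this empirical sensitivity for a single linear layer. It follows the formula above directly; it is not the library's implementation, and it omits the sign handling and per-neuron details from the papers:

```python
import torch


def linear_sensitivity(weight, activations, eps=1e-12):
    """Empirical sensitivity of each weight w_ij in one linear layer.

    weight:       (out_features, in_features) tensor, w_ij = weight[i, j]
    activations:  (num_points, in_features) inputs a(x) for each x in S
    returns:      (out_features, in_features) tensor of sensitivities
    """
    # contribution of each weight to its output neuron: w_ij * a_j(x)
    contrib = weight.unsqueeze(0) * activations.unsqueeze(1)  # (S, out, in)
    # total pre-activation of each neuron: sum_k w_ik * a_k(x)
    total = contrib.sum(dim=-1, keepdim=True)  # (S, out, 1)
    # relative contribution, maximized over the point set S
    return (contrib / (total.abs() + eps)).max(dim=0).values


# toy usage on a random layer and a few sample points
layer = torch.nn.Linear(16, 8)
x = torch.rand(32, 16)
s = linear_sensitivity(layer.weight.detach(), x)
print(s.shape)  # torch.Size([8, 16])
```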

### Pruning in practice
In practice, the pruning pipeline follows an iterative procedure that
alternates between pruning and retraining.
<p align="center">
  <img src="./misc/imgs/pruning_pipeline.jpeg" width="100%">
</p>

### Papers
This repository contains code to reproduce the results from the following
papers:

| Paper | Venue | Title & Link |
| :---: | :---: | :--- |
| **Lost** | MLSys 2021 | [Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy](https://proceedings.mlsys.org/paper/2021/hash/2a79ea27c279e471f4d180b08d62b00a-Abstract.html) |
| **PFP** | ICLR 2020 | [Provable Filter Pruning for Efficient Neural Networks](https://openreview.net/forum?id=BJxkOlSYDH) |
| **SiPP** | arXiv | [SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks](https://arxiv.org/abs/1910.05422) |

### Packages
In addition, the repo also contains two stand-alone python packages that can
be used for any desired pruning experiment:

| Package | Location | Description |
| :---: | :---: | :--- |
| `torchprune` | [./src/torchprune](./src/torchprune) | This package can be used to run any of the implemented pruning algorithms. It also contains utilities to use pre-defined networks (or your own network) and utilities for standard datasets. |
| `experiment` | [./src/experiment](./src/experiment) | This package can be used to run pruning experiments and compare multiple pruning methods for different prune ratios. Each experiment is configured using a `.yaml` configuration file. |

### Paper Reproducibility
The code for each paper is implemented in the respective packages. In
addition, each paper has a separate folder containing additional information
about the paper as well as scripts and parameter configurations to reproduce
its exact results.

| Paper | Location |
| :---: | :---: |
| **Lost** | [paper/lost](./paper/lost) |
| **PFP** | [paper/pfp](./paper/pfp) |
| **SiPP** | [paper/sipp](./paper/sipp) |

## Setup
We provide three ways to install the codebase:
1. [Github repo + full conda environment](#1-github-repo)
2. [Installation via pip](#2-pip-installation)
3. [Docker image](#3-docker-image)

### 1. Github Repo
Clone the github repo:
```bash
git clone git@github.com:lucaslie/torchprune.git
# (or your favorite way to pull a repo)
```

We recommend installing the packages in a separate [conda
environment](https://docs.conda.io/projects/conda/en/latest/user-guide/getting-started.html#managing-python).
To create a new conda environment run
```bash
conda create -n prune python=3.8 pip
conda activate prune
```
To install all required dependencies and both packages, run:
```bash
pip install -r misc/requirements.txt
```
Note that this will also install pre-commit hooks for clean commits :-)

### 2. Pip Installation
To separately install each package with minimal dependencies without cloning
the repo manually, run the following commands:
```bash
# "torchprune" package
pip install git+https://github.com/lucaslie/torchprune/#subdirectory=src/torchprune

# "experiment" package
pip install git+https://github.com/lucaslie/torchprune/#subdirectory=src/experiment
```
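To verify either install, a quick import check (module names as used elsewhere in this README) could look like:

```bash
python -c "import torchprune, experiment; print('both packages importable')"
```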

Note that the [experiment](./src/experiment) package does not automatically
install the [torchprune](./src/torchprune) package.

### 3. Docker Image
You can simply pull the docker image from our docker hub:
```bash
docker pull liebenwein/torchprune
```

You can run it interactively with
```bash
docker run -it liebenwein/torchprune bash
```
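Since the image declares `/data` and `/local` volumes (see the Dockerfile below), you may want to mount host directories so results persist outside the container; the host paths here are illustrative:

```bash
docker run -it \
    -v "$(pwd)/data:/data" \
    -v "$(pwd)/local:/local" \
    liebenwein/torchprune bash
```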

For your reference you can find the Dockerfile [here](./misc/Dockerfile).

## Run experiments
You can find all experiment configurations as outlined in the paper under
`src/experiment/experiment/param`. You can run any of the configurations by
providing the relative path from `param`. For example, to run `Resnet20` on
`CIFAR10` run
```sh
python -m experiment.main cifar/resnet20.yaml
```

### ImageNet experiments
To reproduce the ImageNet experiments, you first need to download the dataset
from [here](http://image-net.org/download). Place the file into
`./data/training/imagenet_object_localization.tar.gz`. The rest is handled
automatically.

To download the pre-trained ImageNet networks (and reduce training time), run
```sh
python -m experiment.util.download_imagenet_weights
```

### Logging
Experiment progress is logged using tensorboard. To see the current progress,
simply start tensorboard from the log directory
```sh
tensorboard --logdir=./data/results
```
and follow the instructions to visualize the data.

### Results
At the end of the run, plots (`.pdf`) and the raw numpy data (`.npz`) are
stored under `./data/results`.

## More Information and Usage
Check out the following `README`s in the sub-directories to find out more
about using the codebase.

| READMEs | More Information |
| --- | --- |
| [src/torchprune/README.md](./src/torchprune) | More details on pruning neural networks, using and setting up the data sets, implementing custom pruning methods, and adding your own data sets and networks. |
| [src/experiment/README.md](./src/experiment) | More details on how to configure and run your own experiments, and more information on how to reproduce the results. |
| [paper/lost/README.md](./paper/lost) | More information on the [Lost](https://proceedings.mlsys.org/paper/2021/hash/2a79ea27c279e471f4d180b08d62b00a-Abstract.html) paper. |
| [paper/pfp/README.md](./paper/pfp) | More information on the [PFP](https://openreview.net/forum?id=BJxkOlSYDH) paper. |
| [paper/sipp/README.md](./paper/sipp) | More information on the [SiPP](https://arxiv.org/abs/1910.05422) paper. |

## Citations
Please cite the respective papers when using our work.

### [Lost in Pruning](https://proceedings.mlsys.org/paper/2021/hash/2a79ea27c279e471f4d180b08d62b00a-Abstract.html) (Pruning Study)
```
@article{liebenwein2021lost,
  title={Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy},
  author={Liebenwein, Lucas and Baykal, Cenk and Carter, Brandon and Gifford, David and Rus, Daniela},
  journal={Proceedings of Machine Learning and Systems},
  volume={3},
  year={2021}
}
```

### [Provable Filter Pruning](https://openreview.net/forum?id=BJxkOlSYDH) (Filter Pruning)
```
@inproceedings{liebenwein2020provable,
  title={Provable Filter Pruning for Efficient Neural Networks},
  author={Lucas Liebenwein and Cenk Baykal and Harry Lang and Dan Feldman and Daniela Rus},
  booktitle={International Conference on Learning Representations},
  year={2020},
  url={https://openreview.net/forum?id=BJxkOlSYDH}
}
```

### [SiPPing Neural Networks](https://arxiv.org/abs/1910.05422) (Weight Pruning)
```
@article{baykal2019sipping,
  title={SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks},
  author={Baykal, Cenk and Liebenwein, Lucas and Gilitschenski, Igor and Feldman, Dan and Rus, Daniela},
  journal={arXiv preprint arXiv:1910.05422},
  year={2019}
}
```
New file: `misc/Dockerfile`:
@@ -0,0 +1,58 @@

```dockerfile
# Base Image is Ubuntu 18.04 with CUDA 11.0
FROM nvidia/cuda:11.0-devel-ubuntu18.04

# Make bash the default shell
SHELL ["/bin/bash", "-c"]

# Install a bunch of useful packages and miniconda
RUN apt-get update && \
    apt-get upgrade -y && \
    apt-get install -y g++ wget libxrender1 vim git && \
    wget --quiet -O ~/miniconda.sh \
    https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh && \
    chmod +x ~/miniconda.sh && \
    ~/miniconda.sh -b -p /opt/conda && \
    rm ~/miniconda.sh

# add conda to environment path variable
ENV PATH /opt/conda/bin:$PATH

# create a custom bashrc that is run even for non-interactive shells
# also explicitly source from all standard bash config files
RUN touch /etc/bashrc_custom && \
    conda init bash && \
    awk '/# >>> conda initialize >>>/,/# <<< conda initialize <<</' ~/.bashrc \
    >> /etc/bashrc_custom && \
    echo -e "\numask 002\n" >> /etc/bashrc_custom && \
    echo -e "\nsource /etc/bashrc_custom\n" >> /etc/bash.bashrc && \
    echo -e "\nsource /etc/bashrc_custom\n" >> /etc/profile && \
    echo -e "\nsource /etc/bashrc_custom\n" >> /root/.bashrc

# tell bash to source the custom bashrc when a shell is started
ENV BASH_ENV /etc/bashrc_custom

# Copy source files and requirements file
COPY ./src /src
COPY ./misc /misc
COPY ./paper /paper
COPY ./misc/requirements.txt /misc/requirements.txt

# Install the required packages
# also include headless opencv
RUN conda install -y pip python=3.8 && \
    pip install -r misc/requirements.txt opencv-python-headless

# Workdir is the root directory
WORKDIR /

# Declare data volume
VOLUME /data

# Declare local volume
VOLUME /local
# default entrypoint is a non-interactive, non-login shell
ENTRYPOINT ["/bin/bash", "-c"]

# run some sample commands as default command
CMD ["echo -e 'Hello World'; nvidia-smi"]
```