update figure links
han-cai committed Jul 19, 2023
1 parent 27b50bd · commit c6ea569
Showing 4 changed files with 6 additions and 6 deletions.
Binary file added figures/ofa_search_cost.png
Binary file added figures/predictor_based_search.png
Binary file added figures/select_subnets.png
tutorial/ofa.ipynb (12 changes: 6 additions & 6 deletions)
@@ -15,7 +15,7 @@
 "Different sub-nets can directly grab weights from the OFA network without training.\n",
 "Therefore, getting a new specialized neural network with the OFA network is highly efficient, incurring little computation cost.\n",
 "\n",
-"![](https://hanlab.mit.edu/files/OnceForAll/figures/ofa_search_cost.png)"
+"![](../figures/ofa_search_cost.png)"
 ]
 },
 {
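For context, the cell above describes weight inheritance: a sub-net is extracted from the trained OFA network with its weights intact, so no retraining is needed. A minimal sketch using the model-zoo API from the once-for-all repo (the model ID and call signatures follow the repo's README; treat them as assumptions):

```python
# Sketch: grab a specialized sub-net from the OFA network without training.
# Assumes the `ofa` package from mit-han-lab/once-for-all is installed.
from ofa.model_zoo import ofa_net

# Load a pretrained once-for-all network (MobileNetV3 design space).
ofa_network = ofa_net('ofa_mbv3_d234_e346_k357_w1.0', pretrained=True)

# Choose an architecture configuration: kernel size, expand ratio, depth.
ofa_network.set_active_subnet(ks=7, e=6, d=4)

# The sub-net directly inherits the OFA network's weights; no extra training.
subnet = ofa_network.get_active_subnet(preserve_weight=True)
```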
@@ -300,7 +300,7 @@
 "metadata": {},
 "source": [
 "## 2. Using Pretrained Specialized OFA Sub-Networks\n",
-"![](https://hanlab.mit.edu/files/OnceForAll/figures/select_subnets.png)\n",
+"![](../figures/select_subnets.png)\n",
 "The specialized OFA sub-networks are \"small\" networks sampled from the \"big\" OFA network, as indicated in the figure above.\n",
 "The OFA network supports over $10^{19}$ sub-networks simultaneously, so the cost of deploying to multiple scenarios can be reduced by 16$\\times$ to 1300$\\times$ (under 40 deployment scenarios).\n",
 "Now, let's play with some of the sub-networks through the following interactive command-line prompt (**note: for CPU users, this step will be skipped**).\n",
@@ -372,7 +372,7 @@
 "(different from the official 50K validation set) so that we do **NOT** need to run very costly inference on ImageNet\n",
 "while searching for specialized models. Such an accuracy predictor is trained using an accuracy dataset built with the OFA network.\n",
 "\n",
-"![](https://hanlab.mit.edu/files/OnceForAll/figures/predictor_based_search.png)"
+"![](../figures/predictor_based_search.png)"
 ]
 },
 {
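Conceptually, the accuracy predictor is a small regression model over an architecture encoding; here is a generic PyTorch sketch (not the tutorial's pretrained `AccuracyPredictor`, whose encoding and weights ship with the repo):

```python
# Sketch: predict a sub-net's top-1 accuracy from its architecture encoding,
# replacing costly ImageNet inference during search. Generic illustration only.
import torch
import torch.nn as nn

class ArchAccuracyPredictor(nn.Module):
    def __init__(self, encoding_dim=128, hidden_dim=400):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(encoding_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # predicted accuracy in [0, 1]
        )

    def forward(self, arch_encoding):
        return self.mlp(arch_encoding).squeeze(-1)

# Training pairs (encoding, measured accuracy) come from an accuracy dataset
# built by evaluating sampled sub-nets of the OFA network on the held-out set.
predictor = ArchAccuracyPredictor()
dummy_encodings = torch.rand(8, 128)  # a batch of architecture encodings
predicted_acc = predictor(dummy_encodings)
```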
@@ -657,7 +657,7 @@
 "source": [
 "**Notice:** You can significantly improve the accuracy of the searched sub-net by fine-tuning it on the ImageNet training set.\n",
 "Our results after fine-tuning for 25 epochs are as follows:\n",
-"![](https://hanlab.mit.edu/files/OnceForAll/figures/diverse_hardware.png)\n",
+"![](../figures/diverse_hardware.png)\n",
 "\n",
 "\n",
 "### 3.2 FLOPs-Constrained Efficient Deployment\n",
@@ -915,8 +915,8 @@
 "**Notice:** Again, you can further improve the accuracy of the searched sub-net by fine-tuning it on ImageNet.\n",
 "The final accuracy is much better than training the same architecture from scratch.\n",
 "Our results are as follows:\n",
-"![](https://hanlab.mit.edu/files/OnceForAll/figures/imagenet_80_acc.png)\n",
-"![](https://hanlab.mit.edu/files/OnceForAll/figures/cnn_imagenet_new.png)\n",
+"![](../figures/imagenet_80_acc.png)\n",
+"![](../figures/cnn_imagenet_new.png)\n",
 "\n",
 "Congratulations! You've finished all the content of this tutorial!\n",
 "We hope you enjoyed playing with OFA networks. If you are interested, please refer to our paper and GitHub repo for further details.\n",
