From 085ac3d41e6b55524d124675ce9ffc68a34624a5 Mon Sep 17 00:00:00 2001
From: Chee Seng Chan
Date: Mon, 11 Oct 2021 22:55:51 +0800
Subject: [PATCH] Update README.md

---
 README.md | 26 +++++++++++++++++++++++++-
 1 file changed, 25 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index b66f6db..fe033c5 100644
--- a/README.md
+++ b/README.md
@@ -12,7 +12,8 @@ Released on July 20, 2021

This work explores model pruning for the image captioning task for the first time. Empirically, we show that networks at 80% to 95% sparsity can either match or even slightly outperform their dense counterparts. To promote Green Computer Vision, we release pre-trained sparse models for UD and ORT that achieve CIDEr scores >120 on the MS-COCO dataset, yet are only 8.7 MB (a 96% reduction compared to dense UD) and 14.5 MB (a 94% reduction compared to dense ORT) in model size.
-

+

+

Figure 1: We show that our deep captioning networks at 80% to 95% sparsity are capable of either matching or even slightly outperforming their dense counterparts.

## Features

@@ -173,3 +174,26 @@ You can explore and visualise generated captions [using this Streamlit app](http
* Maybe try `pycocotools-fix` instead
* This issue may cause GitHub CI to fail if a different `numpy` version is reinstalled after `pycocotools` is built
+## Citation
+If you find this work useful for your research, please cite:
+```
+@article{tan2021end,
+  title={End-to-End Supermask Pruning: Learning to Prune Image Captioning Models},
+  author={Tan, Jia Huei and Chan, Chee Seng and Chuah, Joon Huang},
+  journal={Pattern Recognition},
+  pages={108366},
+  year={2021},
+  publisher={Elsevier},
+  doi={10.1016/j.patcog.2021.108366}
+}
+```
+
+## Feedback
+Suggestions and opinions on this work (both positive and negative) are greatly welcomed. Please contact the authors by sending an email to
+`tan.jia.huei at gmail.com` or `cs.chan at um.edu.my`.
+
+## License and Copyright
+The project is open source under the BSD-3 license (see the `LICENSE` file).
+
+©2021 Universiti Malaya.
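The size figures in the abstract follow from storing only the weights that survive pruning. As a rough illustration (this is not the repository's actual pruning code, and `magnitude_mask` is a hypothetical helper), the sketch below builds a binary keep/drop mask over a weight matrix by magnitude and estimates the storage saving under a naive value-plus-index sparse format:

```python
import numpy as np

def magnitude_mask(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a binary mask keeping the largest-magnitude weights.

    `sparsity` is the fraction of weights to zero out (e.g. 0.95).
    """
    k = int(round(sparsity * weights.size))  # number of weights to prune
    if k <= 0:
        return np.ones_like(weights)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)

mask = magnitude_mask(w, sparsity=0.95)
w_sparse = w * mask                    # pruned weight matrix

achieved = 1.0 - mask.mean()           # fraction of zeroed weights
dense_bytes = w.size * w.itemsize
# Stored sparsely: surviving values plus 32-bit indices (rough estimate)
sparse_bytes = int(mask.sum()) * (w.itemsize + 4)

print(f"sparsity: {achieved:.2%}")
print(f"dense:  {dense_bytes / 1024:.0f} KiB, sparse: {sparse_bytes / 1024:.0f} KiB")
```

At 95% sparsity the sparse form is a small fraction of the dense size even with the index overhead, which is the effect behind the 8.7 MB and 14.5 MB released checkpoints; the actual on-disk reduction depends on the storage format used.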