# ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression
Pretrained [caffe](https://github.com/BVLC/caffe) models for the ICCV'17 paper:

"[ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression](https://arxiv.org/abs/1707.06342)"

For more details, please see our project page: [[ThiNet Project Page]](http://lamda.nju.edu.cn/luojh/project/ThiNet_ICCV17/ThiNet_ICCV17.html)
## Models

Center 224x224 crop validation accuracy on ImageNet:
| Model | Top-1 | Top-5 | #Param. | #FLOPs | forward/backward time (ms) |
| ------------- | ------------- | ------------- | ------------- | ------------- | ------------- |
| [ThiNet-GAP](http://lamda.nju.edu.cn/luojh/project/ThiNet_ICCV17/caffe_model/ThiNet-GAP.zip) | 67.34% | 87.92% | 8.32M | 9.34B | 71.73/145.51 |
| [ThiNet-Tiny](http://lamda.nju.edu.cn/luojh/project/ThiNet_ICCV17/caffe_model/ThiNet-Tiny.zip) | 59.34% | 81.97% | 1.32M | 2.01B | 29.51/55.83 |

**Note: these two models were trained with different image cropping methods; see ``trainval.prototxt`` for more details.**
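
The released models are plain Caffe networks, so they can be sanity-checked with a few lines of pycaffe. The sketch below is only an illustration: the file names (`deploy.prototxt`, `ThiNet-GAP.caffemodel`), the `data` input blob name, the BGR mean values, and the simple resize-then-center-crop preprocessing are assumptions, not the exact pipeline behind the reported numbers (see ``trainval.prototxt`` for that).

```python
# Minimal pycaffe sketch: load a released model and run a center-crop forward pass.
# File names, blob names, and preprocessing constants below are placeholders.
import numpy as np
import caffe

caffe.set_mode_cpu()
net = caffe.Net('deploy.prototxt', 'ThiNet-GAP.caffemodel', caffe.TEST)

# Resize to 256x256, then take the central 224x224 crop.
img = caffe.io.load_image('example.jpg')            # HxWx3, RGB, float in [0, 1]
img = caffe.io.resize_image(img, (256, 256))
top, left = (256 - 224) // 2, (256 - 224) // 2
crop = img[top:top + 224, left:left + 224, :]

# Standard Caffe preprocessing: HWC -> CHW, RGB -> BGR, scale to [0, 255], subtract mean.
# The per-channel mean here is the common ImageNet BGR mean, assumed for illustration.
net.blobs['data'].reshape(1, 3, 224, 224)
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))
transformer.set_channel_swap('data', (2, 1, 0))
transformer.set_raw_scale('data', 255.0)
transformer.set_mean('data', np.array([104.0, 117.0, 123.0]))

net.blobs['data'].data[...] = transformer.preprocess('data', crop)
out = net.forward()
prob = out[net.outputs[0]].squeeze()                # output blob name comes from the prototxt
print('predicted class index:', prob.argmax())
```

The same snippet works for ThiNet-Tiny by swapping in its prototxt and caffemodel from the corresponding zip archive.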
## Citation
If you find this work useful for your research, please cite:
```
@article{ThiNet_ICCV17,
  title={ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression},
  author={Jian-Hao Luo and Jianxin Wu and Weiyao Lin},
  journal={arXiv preprint arXiv:1707.06342},
  year={2017}
}
```
## Contact
Feel free to contact me if you have any questions (Jian-Hao Luo, luojh@lamda.nju.edu.cn).