Language: 🇨🇳 🇺🇸

«ZCls» is a training/inference framework for classification models


Currently implemented:

For more details, see the roadmap

## Table of Contents

## Background

In the fields of object detection, object segmentation, and action recognition, many highly integrated training frameworks with mature pipelines have emerged, such as facebookresearch/detectron2 and open-mmlab/mmaction2.

Image classification is the earliest-developed and most theoretically fundamental field in deep learning. Drawing on existing training frameworks, ZCls implements a training/inference framework for classification models. We hope ZCls brings you a better implementation.

## Installation

See INSTALL

## Usage

For how to train, see Get Started with ZCls

For how to use builtin datasets, see Use Builtin Datasets

For how to use custom datasets, see Use Custom Datasets

For how to use pretrained models, see Use Pretrained Model

## Maintainers

* zhujian - Initial work - zjykzj

## Thanks

```bibtex
@misc{ding2021diverse,
      title={Diverse Branch Block: Building a Convolution as an Inception-like Unit},
      author={Xiaohan Ding and Xiangyu Zhang and Jungong Han and Guiguang Ding},
      year={2021},
      eprint={2103.13425},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@misc{ding2021repvgg,
      title={RepVGG: Making VGG-style ConvNets Great Again},
      author={Xiaohan Ding and Xiangyu Zhang and Ningning Ma and Jungong Han and Guiguang Ding and Jian Sun},
      year={2021},
      eprint={2101.03697},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@misc{fan2020pyslowfast,
      author={Haoqi Fan and Yanghao Li and Bo Xiong and Wan-Yen Lo and Christoph Feichtenhofer},
      title={PySlowFast},
      howpublished={\url{https://github.com/facebookresearch/slowfast}},
      year={2020}
}

@misc{zhang2020resnest,
      title={ResNeSt: Split-Attention Networks},
      author={Hang Zhang and Chongruo Wu and Zhongyue Zhang and Yi Zhu and Haibin Lin and Zhi Zhang and Yue Sun and Tong He and Jonas Mueller and R. Manmatha and Mu Li and Alexander Smola},
      year={2020},
      eprint={2004.08955},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

@misc{han2020ghostnet,
      title={GhostNet: More Features from Cheap Operations},
      author={Kai Han and Yunhe Wang and Qi Tian and Jianyuan Guo and Chunjing Xu and Chang Xu},
      year={2020},
      eprint={1911.11907},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```

For more acknowledgments, see THANKS

## Contributing

Anyone's participation is welcome! Open an issue or submit a pull request.

Note:

## License

Apache License 2.0 © 2020 zjykzj