This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

NNI 2021 Oct~Nov Iteration Plan #4211

Closed
39 of 86 tasks
scarlett2018 opened this issue Sep 26, 2021 · 6 comments
scarlett2018 commented Sep 26, 2021

This is the plan for the 2021 Nov~Dec iteration; it is a 6-week iteration.

Release Plan for upcoming release

  • Release manager: @liuzhe-lz
  • Feature freeze date: 11-26 (delayed from 11-19; last delay)
  • Code freeze and demo date: 12-10 (delayed from 12-3, then 12-8; last delay)
  • Branch cut and next release planning date: 12-24 (delayed from 12-7)
  • Release date: 12-29 (delayed from 12-10)

Documentation Update

NAS

Model Compression

HPO

nnictl

NNI manager & training service

WebUI

Other

  • readthedocs dependency
  • doc translation

Deferred

NAS

  • P1 - @JiahangXu a TBD end-to-end example for NAS with Retiarii
  • P1 - @QuanluZhang Support tuner_no_more_trial
  • P1 - @JiahangXu review graph engine
  • P? - @ultmaster Update mnasnet and example refactor (plan: NNI NAS example refactor plan #4249)
  • P2 - (zebin) More choice type design
  • P2 - (zebin) Survey naslib
  • P2 - (zebin) Port Houwen's algorithm
  • P2 - @liuzhe-lz Merge RetiariiExperiment back to Experiment
  • P3 - Visualization nasbench
  • P3 - Support HPO in Retiarii
  • P3 - Review one-shot, refactor with lightning

Compression

  • Pruning Speed Up (check with ningxin)
  • P1 - Fine-grained iterative speedup logic.
  • P1 - NetAdapt Pruner
  • @J-shang Transformer Pruner. (11.5 11.8) [Model Compression] transformer pruner #4180
  • @zheng-ningxin Function replacement
  • P1 - Sensitivity Pruner.
  • @J-shang ADMM: support more metrics.
  • P2 - Support min_sparsity_per_layer in global mode.
  • P2 - Mixed sparse pattern.
  • P2 - Support mask output/input.
  • P2 - Pruning with constraints (remain 4x channels).
  • P2 - @Fiascolsy Iterative pruning starts from any sparsity ratio.
  • P2 - KD in pruning scheduler (maybe use original model as teacher).
  • @zheng-ningxin Memory usage optimization
  • Refactor of model graph generation (as an independent component)
  • Speed up with special/customized layer.
  • Quantization V2

HPO

  • P1 - fix len(arr) > 0 problem
  • P1 - (zebin) Improve hyperband & PBT with shared storage
  • P3 - Multi-objective
  • P2 - (zebin) transferable HPO
  • P2 - Evolution nested search space

nnictl

NNI Manager

  • P2 - Distributed trial
  • P1 - @acured Project back-end

Web UI

@scarlett2018 scarlett2018 pinned this issue Sep 26, 2021
@liuzhe-lz liuzhe-lz changed the title [Draft] NNI 2021 Oct~Nov Iteration Plan NNI 2021 Oct~Nov Iteration Plan Oct 27, 2021
@chenbohua3

Hi @scarlett2018 I want to add these items for this iteration:

  • About quantization:

    • Support dtype & qscheme customization for lsq quantizer and observer quantizer (P0)
    • Upgrade LSQ quantizer to LSQ+ quantizer, which can be used for asymmetric quantization (P0)
    • add model converter for mnn backend (P1)
  • About architecture for quantization V2:

    • Analyze the architecture of the PyTorch quantization tools in the open source community, and summarize the advantages and disadvantages of each tool (P0), with @linbinskn
  • A demo case for quantizing YOLOX and deploy it on PyTorch backend (P1)
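For context on the LSQ+ item above: asymmetric (affine) quantization maps a float value to an integer using a scale and a zero point, which lets the quantized range cover asymmetric weight/activation distributions. A minimal illustrative sketch of the mapping (not NNI's implementation; function names are hypothetical):

```python
def quantize_asymmetric(x, scale, zero_point, qmin=0, qmax=255):
    """Affine quantization: q = clamp(round(x / scale) + zero_point, qmin, qmax)."""
    q = round(x / scale) + zero_point
    return max(qmin, min(qmax, q))

def dequantize_asymmetric(q, scale, zero_point):
    """Inverse mapping: x is approximately (q - zero_point) * scale."""
    return (q - zero_point) * scale

# A float in [-1, 1] quantized to uint8 with scale 2/255 and zero point 128.
q = quantize_asymmetric(0.5, 2 / 255, 128)
x = dequantize_asymmetric(q, 2 / 255, 128)
```

The zero point is what distinguishes asymmetric from symmetric schemes: with a nonzero zero point, zero in float space maps exactly to an integer, and the integer range need not be centered on it.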

@J-shang J-shang unpinned this issue Nov 22, 2021
@liuzhe-lz liuzhe-lz pinned this issue Nov 24, 2021
@liuzhe-lz

liuzhe-lz commented Dec 13, 2021

Bug Bash

NAS

Model Compression

HPO

nnictl

Training Service

  • @liuzhe-lz Hybrid for all reuse training services

Web UI

New Items

@liuzhe-lz

liuzhe-lz commented Jan 7, 2022

Release Note

NOTE: NNI v2.6 is the last version that supports Python 3.6. From next release NNI will require Python 3.7+.

HPO

Experiment

  • The legacy experiment config format is now deprecated. (doc of new config)
    • If you are still using legacy format, nnictl will show equivalent new config on start. Please save it to replace the old one.
  • nnictl now uses nni.experiment.Experiment APIs as the backend. The output messages of the create, resume, and view commands have changed.
  • Added Kubeflow and Frameworkcontroller support to hybrid mode. (doc)
  • The hidden tuner manifest file has been updated. This should be transparent to users, but if you encounter issues like failed to find tuner, please try to remove ~/.config/nni.
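For readers migrating from the legacy format: the new-style (V2) experiment config is flat and explicit. A sketch of its shape as a Python dict, with field names following the V2 config schema and all values as illustrative placeholders:

```python
# Illustrative new-style (V2) experiment config, expressed as a Python dict.
# Field names follow the V2 config schema; the values are placeholders.
experiment_config = {
    "searchSpaceFile": "search_space.json",
    "trialCommand": "python trial.py",
    "trialConcurrency": 2,
    "maxTrialNumber": 20,
    "tuner": {
        "name": "TPE",
        "classArgs": {"optimize_mode": "maximize"},
    },
    "trainingService": {"platform": "local"},
}
```

As noted above, nnictl prints the equivalent new config when started with a legacy one, so the safest migration path is to copy that output verbatim.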

Algorithms

  • Random tuner now supports classArgs seed. (doc)
  • TPE tuner is refactored: (doc)
    • Support classArgs seed.
    • Support classArgs tpe_args for expert users to customize algorithm behavior.
    • Parallel optimization has been turned on by default. To turn it off set tpe_args.constant_liar_type to null (or None in Python).
    • constant_liar_type has been moved into tpe_args. If you are using it please update your config.
  • Grid search tuner now supports all search space types, including uniform, normal, and nested choice. (doc)
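Putting the TPE changes above together, the tuner section of a config might look like the following sketch. It shows only the keys named in the release note (seed, tpe_args, constant_liar_type); other tpe_args keys exist but are not listed here:

```python
# Illustrative TPE tuner config per the release note; values are placeholders.
tuner_config = {
    "name": "TPE",
    "classArgs": {
        "seed": 42,  # reproducible suggestions
        "tpe_args": {
            # Parallel optimization is on by default; set constant_liar_type
            # to None (null in YAML) to turn it off.
            "constant_liar_type": None,
        },
    },
}
```

Note the migration point from the release note: constant_liar_type now lives inside tpe_args, not at the top level of classArgs.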

NAS

  • Enhancement to serialization utilities (doc) and changes to recommended practice of customizing evaluators. (doc)
  • Support latency constraint on edge device for ProxylessNAS based on nn-Meter. (doc)
  • Trial parameters are now displayed in a more user-friendly way in Retiarii experiments.
  • Refactor NAS examples of ProxylessNAS and SPOS.
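The serialization utilities mentioned above work by recording how an object was constructed so an equivalent object can be re-created on the trial side. A minimal illustrative sketch of that idea (a hypothetical trace helper, not NNI's actual implementation):

```python
def trace(cls):
    """Wrap a class so instances remember the arguments used to build them."""
    class Traced(cls):
        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            # Record the constructor arguments for later re-creation.
            self.trace_args = args
            self.trace_kwargs = kwargs

        def recreate(self):
            # Re-build an equivalent instance from the recorded arguments.
            return type(self)(*self.trace_args, **self.trace_kwargs)
    return Traced

class Evaluator:
    """A stand-in for a user-defined evaluator with configurable settings."""
    def __init__(self, lr=0.01, epochs=10):
        self.lr = lr
        self.epochs = epochs

TracedEvaluator = trace(Evaluator)
ev = TracedEvaluator(lr=0.001)
clone = ev.recreate()
```

This is why the recommended practice for customizing evaluators changed: the framework can only re-create objects whose construction it was able to observe.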

Model Compression

  • New Pruners Supported in Pruning V2
    • Auto-Compress Pruner (doc)
    • AMC Pruner (doc)
    • Movement Pruning Pruner (doc)
  • Support nni.trace-wrapped Optimizer in Pruning V2. The trace records the optimizer's input parameters while affecting the user experience as little as possible. (doc)
  • Optimize Taylor Pruner, APoZ Activation Pruner, Mean Activation Pruner in V2 memory usage.
  • Add more examples for Pruning V2.
  • Add document for pruning config list. (doc)
  • Parameter masks_file of ModelSpeedup now accepts pathlib.Path object. (Thanks to dosemeion) (doc)
  • Bug Fix
    • Fix Slim Pruner in V2 not sparsifying the BN weights.
    • Fix Simulated Annealing Task Generator generating configs that ignore 0 sparsity.
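The pruning config list documented above is a list of rules, each selecting a set of layers and a sparsity to apply. A sketch of its shape, with keys following the NNI pruning config format and all values as illustrative placeholders:

```python
# Illustrative pruning config list; each entry selects layers and a target
# sparsity. Keys follow the NNI pruning config format; values are placeholders.
config_list = [
    {"sparsity": 0.5, "op_types": ["Conv2d"]},  # prune all Conv2d layers to 50%
    {"sparsity": 0.8, "op_names": ["fc1"]},     # prune the layer named "fc1" to 80%
    {"exclude": True, "op_names": ["fc2"]},     # leave "fc2" untouched
]
```

Later entries refine earlier ones, so a broad op_types rule can be combined with per-layer op_names overrides and exclusions.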

Documentation

  • Supported GitHub feature "Cite this repository".
  • Updated index page of readthedocs.
  • Updated Chinese documentation.
    • From now on NNI only maintains translation for the most important docs and ensures they are up to date.
  • Reorganized HPO tuners' doc.

Bugfixes

  • Fixed a bug where numpy array is used as a truth value. (Thanks to khituras)
  • Fixed a bug in updating search space.
  • Fixed a bug that HPO search space file does not support scientific notation and tab indent.
    • For now NNI does not support mixing scientific notation and YAML features. We are waiting for PyYAML to update.
  • Fixed a bug that causes DARTS 2nd order to crash.
  • Fixed a bug that causes deep copy of mutation primitives (e.g., LayerChoice) to crash.
  • Removed blank at bottom in Web UI overview page.
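The numpy truth-value bug above stems from a common pitfall: an array with more than one element is ambiguous in a boolean context and raises a ValueError. A minimal reproduction and the usual explicit fixes:

```python
import numpy as np

arr = np.array([1.0, 2.0])

# Ambiguous: bool() on a multi-element array raises ValueError.
try:
    if arr:
        pass
    ambiguous = False
except ValueError:
    ambiguous = True

# Explicit checks avoid the ambiguity.
non_empty = arr.size > 0  # "does the array have any elements?"
any_true = arr.any()      # "is any element truthy?"
all_true = arr.all()      # "are all elements truthy?"
```

Using `arr.size > 0` (or `len(arr) > 0`) for emptiness and `any()`/`all()` for element tests makes the intent unambiguous.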

@microsoft microsoft deleted a comment from J-shang Jan 17, 2022
@ultmaster

ultmaster commented Jan 17, 2022

NAS

  • Enhancement to serialization utilities (doc) and changes to recommended practice of customizing evaluators (doc)

Bug fixes

  • Fix a bug that caused DARTS 2nd order to crash.
  • Fix a bug that caused deep copy of mutation primitives (e.g., LayerChoice) to crash.

@JiahangXu

NAS

  • Refactor NAS examples of ProxylessNAS and SPOS.
  • Support latency constraint on edge device for ProxylessNAS based on nn-Meter.

@liuzhe-lz liuzhe-lz unpinned this issue Jan 21, 2022