TVM 0.4 Release Note #1577
Thanks to everyone who has contributed to the last release cycle over the past three months. We would like to propose the release of v0.4 on Aug 13th. We encourage everyone in the community to help review and vote on the release. @dmlc/tvm-team please reply to this thread.
The operator fusion enhancement to nnvm is missing in the release note!
@masahi just added that.
@tqchen fusion is now a separate pass.
@zhiics thanks for pointing this out, just added that to the release note.
GraphRuntime support for tvm4j - E2E inference in Java!
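Since the thread doesn't include a Java snippet, here is a minimal sketch of the graph runtime flow that tvm4j's GraphRuntime exposes, written against the 0.4-era Python API (the Java bindings mirror the same create / set_input / run / get_output sequence). The file names, input name, and output shape are placeholders, not artifacts from this release.

```python
# Sketch of end-to-end inference through the graph runtime (0.4-era Python API);
# tvm4j's GraphRuntime offers the same flow from Java.
import numpy as np
import tvm
from tvm.contrib import graph_runtime

# Load a previously compiled model; the "deploy_*" names are hypothetical placeholders.
lib = tvm.module.load("deploy_lib.so")
graph_json = open("deploy_graph.json").read()
params_bytes = bytearray(open("deploy_param.params", "rb").read())

ctx = tvm.cpu(0)
module = graph_runtime.create(graph_json, lib, ctx)
module.load_params(params_bytes)

# Feed one input tensor, run, and fetch the first output (assumed 1x1000 classifier head).
data = np.random.uniform(size=(1, 3, 224, 224)).astype("float32")
module.set_input("data", data)
module.run()
out = module.get_output(0, tvm.nd.empty((1, 1000), "float32")).asnumpy()
```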
Broadcast operators like not_equal, greater_equal, and less_equal are now supported in both nnvm and topi.
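As a rough illustration (not the topi/nnvm operator code itself), a broadcast comparison such as greater_equal can be expressed directly at the tensor-expression level. The sketch below uses the 0.4-era tvm.placeholder/tvm.compute API (later versions moved these under tvm.te); the shapes and mask dtype are chosen arbitrarily.

```python
# Minimal sketch of a broadcast comparison as a tensor-expression compute;
# the topi/nnvm operators added in this release wrap the same pattern.
import tvm

m, n = 4, 8
A = tvm.placeholder((m, n), name="A", dtype="float32")
B = tvm.placeholder((n,), name="B", dtype="float32")   # broadcast along the row axis

# Compare elementwise with B broadcast over axis 0; cast the boolean result
# to uint8 so the output is a mask tensor.
C = tvm.compute((m, n), lambda i, j: (A[i, j] >= B[j]).astype("uint8"), name="C")

s = tvm.create_schedule(C.op)
func = tvm.build(s, [A, B, C], target="llvm")
```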
v0.4 has been tagged: https://github.com/dmlc/tvm/releases/tag/v0.4
The v0.5 roadmap is available at #1596.
This release features several major improvements. The high-level graph optimizer is now part of the TVM repo. Some of the highlights: initial support of AutoTVM for automated optimization, and the customized accelerator backend VTA. Please also check out tvm.ai for the latest blog posts.
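For a flavor of what the initial AutoTVM support looks like, below is a minimal tunable-template sketch in the style of the early AutoTVM tutorials; the knob names, matrix sizes, and target are illustrative assumptions rather than code from this release.

```python
# A minimal AutoTVM template sketch: the schedule exposes tunable knobs,
# and a tuner searches the resulting configuration space.
import tvm
from tvm import autotvm

@autotvm.template
def matmul(N, L, M, dtype):
    A = tvm.placeholder((N, L), name="A", dtype=dtype)
    B = tvm.placeholder((L, M), name="B", dtype=dtype)
    k = tvm.reduce_axis((0, L), name="k")
    C = tvm.compute((N, M), lambda i, j: tvm.sum(A[i, k] * B[k, j], axis=k), name="C")

    s = tvm.create_schedule(C.op)
    y, x = s[C].op.axis
    (k,) = s[C].op.reduce_axis

    # Declare the tunable search space: how to tile the two spatial axes.
    cfg = autotvm.get_config()
    cfg.define_split("tile_y", y, num_outputs=2)
    cfg.define_split("tile_x", x, num_outputs=2)

    # Apply whichever concrete split the tuner picked for this trial.
    yo, yi = cfg["tile_y"].apply(s, C, y)
    xo, xi = cfg["tile_x"].apply(s, C, x)
    s[C].reorder(yo, xo, k, yi, xi)
    return s, [A, B, C]

# Create a tuning task over the template; a tuner (e.g. autotvm.tuner.RandomTuner)
# would then measure candidate configs and log the best schedule it finds.
task = autotvm.task.create(matmul, args=(512, 512, 512, "float32"), target="llvm")
```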
The community welcomes new reviewers @kazum @alex-weaver @masahi @zhreshold @PariksheetPinjari909 @srkreddy1238 @eqy, new code owner @merrymercy, and new committer @yzhliu.
Change List
Tensor Expression and Optimization
Backend
Runtime
- Support tracker in Android RPC, add fault tolerance for AutoTVM
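As a hedged illustration of how the RPC tracker is used from the host side (the tracker address, "android" device key, and timeout below are assumptions), a client or an AutoTVM measurement job requests a remote session roughly like this:

```python
# Sketch of requesting a device session through the RPC tracker.
import tvm
from tvm import rpc

# Connect to a running tracker (started elsewhere, e.g. via
# `python -m tvm.exec.rpc_tracker`), then request any device registered
# under the given key. AutoTVM uses the same mechanism to farm out
# measurements, retrying on another device if one session fails.
tracker = rpc.connect_tracker("0.0.0.0", 9190)
remote = tracker.request("android", priority=1, session_timeout=60)

# The returned session behaves like a direct rpc.connect() session.
ctx = remote.cpu(0)
```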
NNVM
Misc
Contributors
See the complete list here. Thanks to all the contributors who contributed to this release.
Code reviewers
Compiler
TOPI, graph optimization
Frontends
Deploy