[RELAY][COMPILER] Initial Relay interpreter and compiler for TVM runtime system. #1954
Conversation
python/tvm/relay/testing/init.py
factor_type: str, optional
    Can be ``'avg'``, ``'in'``, or ``'out'``.

magnitude: float, optional
Should this be Optional[float]?
src/relay/evaluator.cc
}

Value VisitExpr_(const OpNode* id) override {
  // TODO(@jroesch): Eta-expand and return in this case.
Fix this? It should not be hard.
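For illustration, a toy Python sketch of the idea (the interpreter itself is C++): eta-expanding a bare operator wraps it in a function value that immediately applies it, so the evaluator never has to return a raw operator.

    # Toy sketch only: a bare operator becomes an ordinary callable value.
    def eta_expand(op):
        return lambda *args: op(*args)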
src/relay/evaluator.cc
Value v = Eval(op->cond);
if (const TensorValueNode* bv = v.as<TensorValueNode>()) {
  // TODO(@jroesch): Ask TQ
  if (reinterpret_cast<uint8_t*>(bv->data->data)[0]) {
There is something in DCE that does roughly what we want here; maybe reuse it.
src/relay/evaluator.cc
    return Eval(op->false_branch);
  }
} else {
  throw EvalError("type error, type system should have caught this");
Use CHECK instead, as CHECK gives a stack trace.
src/relay/pass/monomorph.cc
namespace tvm {
namespace relay {

using MMCacheKey = std::pair<Expr, Array<Type>>;
for compatibility
@eqy please help review this
python/tvm/relay/testing/init.py
if "depthwise" in name: | ||
factor = 3 * 3 | ||
scale = np.sqrt(self.magnitude / factor) | ||
if self.rnd_type == "uniform": |
Do we want to allow the user to pass in their own generator/initializer, or should they write a subclass of Initializer?
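For illustration, a minimal sketch of the subclassing route, assuming the base class exposes an _init_weight(name, arr) hook that fills a buffer in place (the hook name is an assumption, not confirmed by this diff):

    import numpy as np

    class FixedUniform(Initializer):  # Initializer from relay/testing/init.py
        """Fill every weight from a fixed-scale uniform distribution."""
        def __init__(self, scale=0.1):
            super().__init__()
            self.scale = scale

        def _init_weight(self, name, arr):
            # assumed hook: arr is a mutable buffer filled in place
            arr[:] = np.random.uniform(-self.scale, self.scale, arr.shape)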
python/tvm/relay/to_tvm.py
op_attrs = attr.ib()
is_output = attr.ib(default=False)

def to_json(self):
Will these contain concrete shapes/params for operators, or will we need another way to extract them? E.g., stride, padding, kernel dims, input dims for conv2d.
This should put all the attributes in the operator node, just like the NNVM format does, unless I misunderstood something.
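As a rough illustration of the intent (field names here are illustrative, not the exact schema this PR emits), a conv2d node would carry its operator attributes inline, NNVM-style:

    # Hypothetical shape of a serialized conv2d node; not the exact schema.
    conv2d_node = {
        "op": "conv2d",
        "name": "conv0",
        "attrs": {
            "strides": "(1, 1)",
            "padding": "(1, 1)",
            "kernel_size": "(3, 3)",
        },
        "inputs": [[0, 0, 0], [1, 0, 0]],
    }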
src/relay/pass/monomorph.cc
}
};

struct MonoMorphizer : ExprMutator {
nit: Is the camel-case really suitable here? "Monomorphize" is one word
I did a quick review; I will do another round when the PR is ready. Really nice work. One question: do we have a plan to use a pass manager? It looks like we are starting to add more passes.
include/tvm/relay/base.h
  CHECK(fdebug) << "Could not find Relay Python debugger function."; \
  (*fdebug)("RELAY_DEBUG", __FILE__, __LINE__, __VA_ARGS__);         \
}

/*!
 * \brief we always used NodeRef for referencing nodes.
s/we/We
include/tvm/relay/interpreter.h
 * \brief An interpreter for Relay.
 *
 * This file implements a simple reference interpreter for Relay programs.
 * Given a Relay environment, an a Relay expression it produces a value.
s/an/and/, or delete the "an".
include/tvm/relay/pass.h
@@ -125,10 +142,12 @@ tvm::Array<TypeVar> FreeTypeVars(const Expr& expr);

 /*! \brief Remove expressions which does not effect the program result.
  *
- * It will remove let binding that are not referenced, and if branch that are not entered.
+ * It will remove let binding that are not referenced, and if branch that are
s/branch/branches/
include/tvm/relay/pass.h
// template<typename T>
// struct hash<typename std::enable_if<std::is_base_of<tvm::relay::Expr, T>::value, T>::type> {
//   size_t operator()(const T& expr) const {
remove the SFINAE if it is not used?
include/tvm/relay/pass.h
struct hash<tvm::Array<T>> {
  size_t operator()(const tvm::Array<T>& array) const {
    size_t hash = std::hash<std::string>()(array->_type_key);
    for (auto elem : array) {
Use auto& here to avoid copying each element?
include/tvm/tuple.h
template <class Tuple, size_t Index = std::tuple_size<Tuple>::value - 1>
struct TupleForEach {
  static void apply(std::vector<NodeRef>& fields, Tuple const& tuple) {
I understand why we need &fields here, but I am not sure whether we prefer passing by pointer instead.
python/tvm/relay/op/_tensor.py
import topi
from . import register

def add_compiler(attrs, inputs, output_type):
s/add_compiler/add_compute ?
Should we also have target for fcompute?
I think it is not a bad idea to also add target for compute
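A hypothetical sketch of what a target-aware compute rule could look like; the extra target parameter and the registration call below are illustrative, not what this PR currently ships:

    import topi

    # Hypothetical: the compute rule also receives the target, so ops like
    # conv2d could select a backend-specific TOPI kernel here.
    def add_compute(attrs, inputs, output_type, target):
        return [topi.add(inputs[0], inputs[1])]

    register("add", "FTVMCompute", add_compute)  # assumed registration form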
python/tvm/relay/to_tvm.py
@attr.s
class OpNode(Node):
    """An operator node in the graph representation we lower to before NNVM's graph."""
May I ask if this graph is more like the one generated after graph compilation? If so, it seems that it is not before NNVM. Am I missing something?
Yeah, sorry, this is a typo; this is TVM's graph input.
src/relay/interpreter.cc
  return ClosureNode::make(captured_env, func);
}

Value invoke_operator(const Op& op, tvm::Array<Value>& args, const tvm::Array<Type>& type_args, const Attrs& attrs, Type ret_type) {
Should this be const tvm::Array<Value>& args?
@zhiics Tianqi has been pushing back on implementing a pass manager for the past couple of weeks. He feels we should wait until we have a few more passes and use cases before we go all in on designing the pass manager. I'm happy to build one sooner rather than later for various reasons, which I'm happy to elaborate on.
include/tvm/relay/expr.h
@@ -219,7 +222,8 @@ class FunctionNode : public ExprNode {
   TVM_DLL static Function make(tvm::Array<Var> params,
                                Expr body,
                                Type ret_type,
-                               tvm::Array<TypeVar> ty_params);
+                               tvm::Array<TypeVar> ty_params,
+                               tvm::Attrs attrs = DictAttrsNode::make({}));
Can we allow attrs to be None?
Yeah, sure. Do you want it to default to None? Right now the invariant is that it is always DictAttrs, since it is the most general.
I think it might not be a bad idea to make it default to None
python/tvm/relay/to_tvm.py
@@ -0,0 +1,394 @@
"""
A compiler from a Relay expression to TVM's graph runtime.
Rename this module to graph_runtime_codegen?
Commits:
- Add tuple.h
- Improve Evaluator
- Complete initial refactor of evaluator
- Repair evaluator tests
- Address some feedback and clean up unneeded code.
- Remove one piece of the hack
- Repair broken tests
- Add better module docstring
- Refactor to make Function attributes immutable (this PR changes around how the passes work in order to accommodate immutable attributes)
- Fix failing type_infer test
- Add fix-up analyses in lower_ops.cc
- Forgot to add renamed to_tvm.py
- Remove experiment of adding name to Function
- Fix
heads.append(NodeRef(i).to_json())

# Compute "node_row_ptr".
# TODO
TODO what?
See the comment above; currently I'm using NNVM to fill this field in.
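For context, a minimal sketch of what filling this field involves, assuming the usual graph-runtime layout where node_row_ptr is a prefix sum over per-node output counts (so entry i of node n lives at row node_row_ptr[n] + i); this is an assumption about the format, not code from this PR:

    def compute_node_row_ptr(nodes):
        """Sketch only: prefix sum of output counts, not this PR's code."""
        row_ptr = [0]
        for node in nodes:
            # assume each node knows its output count; default to 1
            row_ptr.append(row_ptr[-1] + getattr(node, "num_outputs", 1))
        return row_ptr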
def test_tuple():
    tp = relay.TensorType((10,))
restore
tests/python/relay/test_evaluator.py
# failing due to numeric issues

# def test_dense():
restore or remove
There is a numeric precision issue I want feedback on. Can anyone tell me how to get dense to give results near numpy's mult?
Is the difference very large? We could just check for being within some epsilon of each other. I think it is important that we have tests that use important and complex ops
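For instance, the standard NumPy tolerance check; the rtol/atol values below are illustrative:

    import numpy as np

    x = np.random.uniform(size=(4, 8)).astype("float32")
    w = np.random.uniform(size=(3, 8)).astype("float32")
    reference = x.dot(w.T)
    # stand-in for the Relay dense result, perturbed to mimic float32 drift
    result = reference + np.float32(1e-6)
    np.testing.assert_allclose(result, reference, rtol=1e-5, atol=1e-5)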
src/relay/interpreter.cc
struct Frame {
  // In the efficient version this should separate args, locals, and return
  // address.
indent
@eqy @zhiics @tqchen @MarisaKirisame @slyubomirsky @joshpoll this is ready for review if you could all give it some eyeballs. @junrushao1994 @ZihengJiang?
I am very interested in this PR. Will do tomorrow!
Co-Authored-By: jroesch <roeschinc@gmail.com>
Thanks @jroesch @junrushao1994 @joshpoll @zhiics @slyubomirsky @MarisaKirisame, this is now merged
This PR implements the evaluation machinery for Relay, including basic passes for lowering and transforming Relay programs to make them suitable for compilation.

Passes
Operator calls are lowered into a LoweredFunc using the appropriate TVM operator and schedule.

Interpreter
This PR ships a simple interpreter for Relay programs, which can be used as its reference semantics. The interpretation of operator calls utilizes TVM-compiled operators on CPU, but the evaluation of other expressions is naive.
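For a flavor of the workflow, a sketch under assumptions (relay.var, relay.add, and relay.Function as exposed by this PR's Python frontend; the evaluation entry point varies across the PR's revisions, so it is left as a placeholder):

    import numpy as np
    import tvm.relay as relay

    # Build a trivial program: elementwise add of two 10-element tensors.
    x = relay.var("x", relay.TensorType((10,), "float32"))
    y = relay.var("y", relay.TensorType((10,), "float32"))
    func = relay.Function([x, y], relay.add(x, y))

    a = np.random.rand(10).astype("float32")
    b = np.random.rand(10).astype("float32")
    # result = evaluate(env, func, a, b)  # assumed test-suite helper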
TVM Graph Runtime
This PR ships a compiler located in to_tvm.py which converts a Relay program into a TVM graph, as well as using the above passes to generate a tvm::Module, enabling users to deploy a subset of Relay programs to the TVM RTS.
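To make the deployment path concrete, a hedged sketch using the stock graph runtime API of the time; how to_tvm.py hands back graph_json and lib is assumed here, not taken from the diff:

    import tvm
    from tvm.contrib import graph_runtime

    # Assume graph_json (str) and lib (tvm.Module) came from the to_tvm.py
    # compiler; deploy on CPU via the TVM graph runtime.
    module = graph_runtime.create(graph_json, lib, tvm.cpu(0))
    module.set_input("x", a)  # a: numpy input matching the graph's "x"
    module.run()
    out = module.get_output(0)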
Testing
We ship some more tests for both the evaluator and the RTS, as well as a couple of new cases for type inference.
This is blocked on #2026 and #2025.