[Relay][Docs] Relay Language Reference #2232
Conversation
I did a first pass but didn't make it through the full documents yet; I will try again tomorrow.
> allowing the user to bind an expression to a name.
>
> A `let` binding contains a local variable,
> an optional type annotation, a value, and a body expression
We should probably comment on explicit vs. implicit sharing here @MarisaKirisame
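(For context, a `let` binding in the text format might look roughly like the sketch below; this is a hypothetical illustration, since the concrete syntax was still being settled in this PR.)

```
// hypothetical sketch of a let expression in the proposed text format
let %v: Tensor[(10, 10), float32] = add(%x, %y);
multiply(%v, %v)   // the body expression; %v is in scope here
```

Binding the intermediate result to `%v` also makes its sharing explicit in the IR, which is the explicit-vs.-implicit sharing question raised above.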
Thanks for the suggested changes, @jroesch, I will incorporate them all later.
@slyubomirsky @tqchen could you tag all the people who reviewed the original one to review this, and then we can try and upstream it?
@junrushao1994 @zhiics @yzhliu @ZihengJiang @MarisaKirisame Tagging reviewers from the original docs PR.
@junrushao1994 @zhiics @yzhliu @ZihengJiang @MarisaKirisame Please comment; I will merge this in 48 hours if there are no additional comments.
`docs/langref/relay_expr.rst` (Outdated)
> .. code-block:: python
>
>    def @plus(%x: Tensor<(10, 10), float32>, %y: Tensor<(10, 10), float32>)
The syntax needs to be updated to the new proposed syntax.
Which part needs to be changed? I see that the examples @joshpoll posted use square brackets on tensor types and `fn` instead of `fun` for function literals. I will change those, but are there other inconsistencies?
Ah and no brackets around if-else blocks
I recall it was `Tensor[(10, 10), float32]`?
Sorry, I was commenting on stale content, but yes, we need brackets around if-else.
I'll put the brackets back then (only two places). The actual ANTLR AST for if-else exprs doesn't appear to require brackets, so we'll have to sync those specs
The ANTLR grammar does require brackets. It uses a body expression. `test_ir_parser.py` contains examples for each language construct.
This example is fine except for the tensor types.
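For reference, under the proposed syntax the example would presumably read something like the sketch below (square brackets for tensor shapes, `fn` for function literals, and braces around if-else arms); this is a hypothetical illustration, not text lifted from the docs:

```
def @plus(%x: Tensor[(10, 10), float32], %y: Tensor[(10, 10), float32]) {
  add(%x, %y)
}

// if-else with explicit braces around each branch
fn (%x: Tensor[(10, 10), float32], %use_relu: Tensor[(), bool]) {
  if (%use_relu) {
    nn.relu(%x)
  } else {
    zeros_like(%x)
  }
}
```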
`docs/langref/relay_expr.rst` (Outdated)
> ~~~~~
>
> The tuple node builds a finite (that is, of statically known size) sequence of heterogeneous data.
> These tuples match Python's closely. Their fixed length allows for efficient projection of their
In this sense, is `Tuple` immutable as in Python?
All Relay values are immutable, so that is true.
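For instance, a tuple is constructed and projected roughly like this in the text format (a sketch; the projection index must be a constant, since the tuple's length is statically known):

```
// build a 3-tuple from values of possibly different types, then project a field
let %t = (%a, %b, %c);
%t.1   // selects the second field
```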
`docs/langref/relay_type.rst` (Outdated)
> learning IRs. Treating tensor shapes as types allows Relay to perform
> more powerful reasoning at compile time; in particular, Relay can statically
> reason about operations whose output shapes vary based on the input shapes
> in complex ways. Statically reasoning about shapes in this manner allows
Can you give a concrete example delineating the mechanism of this feature? It's important for people coming from the dataflow background, the NNVM way, to understand what is happening under the hood.
This is handled through Relay's "pass" mechanism. I will add a sentence or two about it, but I think a full discussion of it might merit a document of its own. @jroesch, do you think we should have a page on writing Relay passes and how they work?
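As a rough illustration of the kind of reasoning involved (a hedged sketch in the proposed text format, not taken from the docs): an operator like `concatenate` has an output shape computed from its input shapes by a type relation, so the result shape below can be checked at compile time rather than discovered at run time.

```
// concatenating a (3, 4) and a (5, 4) tensor along axis 0 yields an (8, 4) tensor;
// the annotated return type can be verified statically
fn (%x: Tensor[(3, 4), float32], %y: Tensor[(5, 4), float32]) -> Tensor[(8, 4), float32] {
  concatenate((%x, %y), axis=0)
}
```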
> See :py:class:`~tvm.relay.ty.Type` for its definition and documentation.
>
> Tensor Type
Does a zero-dim tensor, i.e., a scalar, fall into this category? If so, could you please elaborate on how it is represented in Relay? I think this is a very important improvement, since it is not resolved in NNVM.
Yes, scalars are also given tensor types. I will add a sentence making that explicit. (As far as I know, I don't think Relay currently does anything "special" for scalars compared to other tensors.)
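For instance, a scalar is simply a tensor of rank zero, so its type could be written along these lines (a sketch; a bare `float32` is commonly used as shorthand for the zero-rank tensor type):

```
// scalars are zero-dimensional tensors
fn (%x: Tensor[(), float32], %y: Tensor[(), float32]) -> Tensor[(), float32] {
  add(%x, %y)
}
```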
`docs/langref/relay_type.rst` (Outdated)
> static typing is useful for determining the
>
> Relay's type system features a form of dependent typing for shapes to
> replace the dynamic shape inference used in NNVM and most other machine
Sorry, I didn't quite understand this (shouldn't the shape in NNVM also be static?). Could you please elaborate a bit? Thanks.
I'll elaborate on it then. The idea is that shape inference is built into the type system and resolved entirely at compile time.
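For example (again a hedged sketch of the text format), the output shape of a broadcasting `add` is solved by its type relation during type checking, so there is no separate run-time shape-inference step:

```
// the broadcast type relation resolves (10, 1) + (1, 5) to (10, 5) at compile time
fn (%x: Tensor[(10, 1), float32], %y: Tensor[(1, 5), float32]) -> Tensor[(10, 5), float32] {
  add(%x, %y)
}
```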
I see the point. I thought you were mentioning "dynamic shape". Thanks.
Thanks for putting so many details together. Another topic which is not expanded here but may be of interest to many people from an NNVM background is subgraphs. While there is a hacky way of supporting subgraphs in NNVM, it seems Relay can naturally and gracefully achieve the same purpose. Maybe it deserves a full page of explanation in the future.
Thanks to all those who have reviewed so far for your input, particularly for the requests for clarification and the careful checking for typos. I will remain alert to further reviews or comments.
I think it is fine to mention TempExpr and IncompleteType in a developer-facing Expr section to explain the design philosophy (embedding partially transformed information in the IR). We don't need to mention TypeReporter, as it is an implementation detail.
Very well, I'll add a brief subsection for TempExpr.
Some minor comments; I think it is good to go.
> Global Functions
> ~~~~~~~~~~~~~~~~
Let us add a section about Module, or rename Global Functions into Module.
As per @reminisce's suggestion, mention that the Module mechanism is naturally used to represent subgraphs, in traditional deep learning frameworks' terminology. Give an example of a function calling into sub-functions.
Right, will do.
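Something along these lines, perhaps (a hypothetical sketch): a module collects global definitions, and one global function can call another, which is how a "subgraph" in the NNVM sense can be expressed.

```
// a module with two global functions; @main calls into the sub-function @muladd
def @muladd(%x: Tensor[(10, 10), float32],
            %y: Tensor[(10, 10), float32],
            %z: Tensor[(10, 10), float32]) {
  add(multiply(%x, %y), %z)
}

def @main(%a: Tensor[(10, 10), float32],
          %b: Tensor[(10, 10), float32],
          %c: Tensor[(10, 10), float32]) {
  @muladd(%a, %b, %c)
}
```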
@joshpoll @jroesch @zhiics @reminisce please take another look, and approve or request changes explicitly as described in https://docs.tvm.ai/contribute/code_review.html#approve-and-request-changes-explicitly
`docs/langref/relay_expr.rst` (Outdated)
> ~~~~~~~~~~~~~~~~~~
>
> Global identifiers are prefixed by the :code:`@` sigil, such as ":code:`@global`".
> A global identifier always references a globally visible definition contained in the environment.
I think there should probably be a reference to modules here.
Good idea
@joshpoll @zhiics @reminisce please take another look; if there is no further comment, I will merge this PR in 24 hours.
@slyubomirsky once you fix if-then-else and the reordering suggestion, I will approve.
Thanks to @slyubomirsky for the hard work on proposing and improving the PR, and thanks to @jroesch @joshpoll @zhiics @reminisce for the informative reviews. This is now merged!
Here is an attempt at revising @jroesch's old Relay documentation for #2212. There have been large changes to the language since the initial version of those comments (see the first commit, which preserved them), so there was a lot to adjust.
Organization of these documents was a tough point, so I would especially appreciate feedback on that. Any feedback on content and clarity (or other relevant points) would be welcome as well.
One change @tqchen recommended on the previous documentation PR was keeping Relay language reference docs in the same directory as the other language reference documents, so I did that here rather than having it in a subdirectory. I also would be interested in discussion as to whether this is a good format.