[Relay/TOPI][Frontend] Add tile and repeat operators in Relay and TOPI #2720
Conversation
@tqchen Why did CI fail at tests/python/contrib/test_sort.py? CI passed after I refreshed the PR.
src/relay/op/tensor/transform.cc
Outdated
// check dimension match
CHECK(reps.defined())
  << "repetition array is not defined. data.ndim = " << ndim;
const int rndim = static_cast<int>(reps.size());
please keep the type as size_t, also change the index variable to size_t, e.g., for (size_t i = 0; ...
src/relay/op/tensor/transform.cc
Outdated
bool RepeatRel(const Array<Type>& types,
               int num_inputs,
               const Attrs& attrs,
               const TypeReporter& reporter) {
align the indent?
@@ -316,6 +316,52 @@ def stack(data, axis):
    return _make.stack(data, axis)


def repeat(data, repeats, axis):
    """Repeats elements of an array.
    By default, repeat flattens the input array into 1-D and then repeats the elements.
can we add an example? I guess we can simply copy from numpy's doc.
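A docstring example along the lines of numpy's doc (which relay.repeat follows) could look like this; the values below use numpy's np.repeat for illustration:

```python
import numpy as np

x = np.array([[1, 2], [3, 4]])

# Without an axis, the input is flattened to 1-D before repeating.
np.repeat(x, 2)          # -> array([1, 1, 2, 2, 3, 3, 4, 4])

# With axis=0, each row is repeated in place.
np.repeat(x, 2, axis=0)  # -> array([[1, 2], [1, 2], [3, 4], [3, 4]])

# With axis=1, each element is repeated along the columns.
np.repeat(x, 2, axis=1)  # -> array([[1, 1, 2, 2], [3, 3, 4, 4]])
```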
def tile(data, reps):
    """Repeats the whole array multiple times.
same here
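For tile, a numpy-style example like the following could be copied into the docstring (values shown follow numpy's np.tile semantics):

```python
import numpy as np

a = np.array([0, 1, 2])

# A scalar rep tiles the array along its last axis.
np.tile(a, 2)       # -> array([0, 1, 2, 0, 1, 2])

# A tuple rep tiles along multiple axes; a 1-D input is first
# promoted to 2-D here, giving a (2, 6) result.
np.tile(a, (2, 2))  # -> array([[0, 1, 2, 0, 1, 2], [0, 1, 2, 0, 1, 2]])
```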
python/tvm/relay/op/transform.py
Outdated
.. note::
    Each dim size of reps must be a positive integer. If reps has length d,
    the result will have dimension of max(d, a.ndim); If a.ndim < d, a is
    promoted to be d-dimensional by prepending new axes. If a.ndim >= d, reps
promoted to be d-dimensional by prepending new axes. If a.ndim >= d, reps
promoted to be d-dimensional by prepending new axes. If data.ndim >= d, reps
python/tvm/relay/op/transform.py
Outdated
The input data to the operator.

reps : tuple of int
    The number of times repeating the tensor a.
The number of times repeating the tensor a.
The number of times repeating the tensor data.
python/tvm/relay/op/transform.py
Outdated
Each dim size of reps must be a positive integer. If reps has length d,
the result will have dimension of max(d, a.ndim); If a.ndim < d, a is
promoted to be d-dimensional by prepending new axes. If a.ndim >= d, reps
is promoted to a.ndim by pre-pending 1's to it.
is promoted to a.ndim by pre-pending 1's to it.
is promoted to data.ndim by prepending 1's to reps.
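The promotion rule being discussed matches numpy's tile; a quick sketch of both directions, using numpy for illustration:

```python
import numpy as np

b = np.array([[1, 2], [3, 4]])  # data.ndim = 2

# len(reps) = 1 < data.ndim, so reps is promoted to (1, 2):
# the array is tiled twice along the last axis only.
np.tile(b, 2)            # -> array([[1, 2, 1, 2], [3, 4, 3, 4]])

c = np.array([1, 2])     # data.ndim = 1

# data.ndim = 1 < len(reps) = 2, so the data is promoted to
# shape (1, 2) before tiling; the result has shape (2, 4).
np.tile(c, (2, 2))       # -> array([[1, 2, 1, 2], [1, 2, 1, 2]])
```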
Looks good to me, just some redundant casts.
src/relay/op/tensor/transform.cc
Outdated
  return false;
}
const auto* param = attrs.as<TileAttrs>();
const size_t ndim = static_cast<size_t>(data->shape.size());
data->shape.size() is already size_t, right? No need to cast.
topi/include/topi/transform.h
Outdated
                 std::string tag = kBroadcast) {
  int ndim = static_cast<int>(x->shape.size());
  int rdim = static_cast<int>(reps.size());
  int tdim = (ndim > rdim) ? ndim : rdim;
Array.size() is size_t, no need to cast to int.
src/relay/op/tensor/transform.cc
Outdated
// check dimension match
CHECK(reps.defined())
  << "repetition array is not defined. data.ndim = " << ndim;
const size_t rndim = static_cast<size_t>(reps.size());
delete cast
Please kick the CI.
apache#2720)
* tile and repeat operator added in relay
* fix pylint
* fix make warnings
* comments addressed
* fix lint error
* comment addressed
tile and repeat are used in both mxnet and numpy. BlockGrad in mxnet during inference is just like dropout, which we skip.