ndarray: more Base-like APIs #303
Conversation
- also fix the axis value mapping - `mean(arr, axis=0)` is not Julian
Force-pushed from 24cb30c to 4d8dfc4
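The "axis value mapping" point in the commit above can be illustrated with plain Base arrays (a sketch using the `Statistics` stdlib, not the NDArray type): NumPy's `axis=0` keyword has no Julian counterpart. Base spells reductions with a 1-based `dims` argument, and since Julia arrays are column-major while the underlying layout NDArray exposes is row-major, NumPy's axis 0 corresponds to the last Julia dimension.

```julia
using Statistics

# Base-like reductions use 1-based `dims`, not a Python-style `axis=0`.
A = reshape(1.0:12.0, 4, 3)    # 4×3 matrix; columns are 1:4, 5:8, 9:12

col_means = mean(A, dims=1)    # reduce along dimension 1 → 1×3 row
row_means = mean(A, dims=2)    # reduce along dimension 2 → 4×1 column

col_means                      # 1×3 Matrix: [2.5 6.5 10.5]
```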
Codecov Report
@@ Coverage Diff @@
## master #303 +/- ##
==========================================
- Coverage 70.3% 69.27% -1.03%
==========================================
Files 25 25
Lines 1926 1894 -32
==========================================
- Hits 1354 1312 -42
- Misses 572 582 +10
Continue to review full report at Codecov.
min, max, etc?
- remove `mx.max`, `mx.min`, `mx.max_axis` and `mx.min_axis`
Done for min and max. Anything else I missed?
let `sum` and `mean` be Base-like
Force-pushed from f62d2d4 to 7e19bec
I'm searching the docs for the keywords 'dim' or 'axis'; it seems there are a lot of operators to consider (e.g. …).
I guess the docstrings are the only thing we can search; I don't see any API that exposes the function signatures or similar. I still prefer to do it manually.
for discovering non-Julian APIs

@pluskid please check out 8c5b6cf:

julia> mx._sig_checker()
WARNING: BatchNorm(data, gamma, beta, moving_mean, moving_var, eps, momentum, fix_gamma, use_global_stats, output_mean_var, axis, cudnn_off)
WARNING: CuDNNBatchNorm(data, eps, momentum, fix_gamma, use_global_stats, output_mean_var, axis, cudnn_off)
WARNING: Deconvolution(data, weight, bias, kernel, stride, dilate, pad, adj, target_shape, num_filter, num_group, workspace, no_bias, cudnn_tune, cudnn_off, layout)
WARNING: GridGenerator(data, transform_type, target_shape)
WARNING: SliceChannel(data, num_outputs, axis, squeeze_axis)
WARNING: Softmax(data, grad_scale, ignore_label, multi_output, use_ignore, preserve_shape, normalization, out_grad, smooth_alpha)
WARNING: SoftmaxOutput(data, label, grad_scale, ignore_label, multi_output, use_ignore, preserve_shape, normalization, out_grad, smooth_alpha)
WARNING: SpatialTransformer(data, loc, target_shape, transform_type, sampler_type)
WARNING: _backward_slice_axis()
WARNING: _ones(shape, ctx, dtype)
WARNING: _random_exponential(lam, shape, ctx, dtype)
WARNING: _random_gamma(alpha, beta, shape, ctx, dtype)
WARNING: _random_generalized_negative_binomial(mu, alpha, shape, ctx, dtype)
WARNING: _random_negative_binomial(k, p, shape, ctx, dtype)
WARNING: _random_normal(loc, scale, shape, ctx, dtype)
WARNING: _random_poisson(lam, shape, ctx, dtype)
WARNING: _random_uniform(low, high, shape, ctx, dtype)
WARNING: _sample_exponential(lam, shape, dtype)
WARNING: _sample_gamma(alpha, shape, dtype, beta)
WARNING: _sample_generalized_negative_binomial(mu, shape, dtype, alpha)
WARNING: _sample_multinomial(data, shape, get_prob, dtype)
WARNING: _sample_negative_binomial(k, shape, dtype, p)
WARNING: _sample_normal(mu, shape, dtype, sigma)
WARNING: _sample_poisson(lam, shape, dtype)
WARNING: _sample_uniform(low, shape, dtype, high)
WARNING: _sparse_sum(data, axis, keepdims, exclude)
WARNING: _square_sum(data, axis, keepdims, exclude)
WARNING: _zeros(shape, ctx, dtype)
WARNING: argmax(data, axis, keepdims)
WARNING: argmin(data, axis, keepdims)
WARNING: argsort(data, axis, is_ascend)
WARNING: broadcast_axes(data, axis, size)
WARNING: broadcast_axis(data, axis, size)
WARNING: broadcast_to(data, shape)
WARNING: expand_dims(data, axis)
WARNING: flip(data, axis)
WARNING: log_softmax(data, axis)
WARNING: nanprod(data, axis, keepdims, exclude)
WARNING: nansum(data, axis, keepdims, exclude)
WARNING: normal(loc, scale, shape, ctx, dtype)
WARNING: pick(data, index, axis, keepdims)
WARNING: random_exponential(lam, shape, ctx, dtype)
WARNING: random_gamma(alpha, beta, shape, ctx, dtype)
WARNING: random_generalized_negative_binomial(mu, alpha, shape, ctx, dtype)
WARNING: random_negative_binomial(k, p, shape, ctx, dtype)
WARNING: random_normal(loc, scale, shape, ctx, dtype)
WARNING: random_poisson(lam, shape, ctx, dtype)
WARNING: random_uniform(low, high, shape, ctx, dtype)
WARNING: repeat(data, repeats, axis)
WARNING: reshape_like(lhs, rhs)
WARNING: reverse(data, axis)
WARNING: sample_exponential(lam, shape, dtype)
WARNING: sample_gamma(alpha, shape, dtype, beta)
WARNING: sample_generalized_negative_binomial(mu, shape, dtype, alpha)
WARNING: sample_multinomial(data, shape, get_prob, dtype)
WARNING: sample_negative_binomial(k, shape, dtype, p)
WARNING: sample_normal(mu, shape, dtype, sigma)
WARNING: sample_poisson(lam, shape, dtype)
WARNING: sample_uniform(low, shape, dtype, high)
WARNING: scatter_nd(data, indices, shape)
WARNING: slice_axis(data, axis, begin, end)
WARNING: softmax(data, axis)
WARNING: sort(data, axis, is_ascend)
WARNING: split(data, num_outputs, axis, squeeze_axis)
WARNING: stack(data, axis, num_args)
WARNING: sum_axis(data, axis, keepdims, exclude)
WARNING: swapaxes(data, dim1, dim2)
WARNING: take(a, indices, axis, mode)
WARNING: topk(data, axis, k, ret_typ, is_ascend)
WARNING: uniform(low, high, shape, ctx, dtype)
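The output above suggests the checker flags any operator whose argument list contains Python-flavoured names such as `axis`. A minimal sketch of that idea (hypothetical names and word list; the real `mx._sig_checker` inspects libmxnet's operator registry, which is not reproduced here):

```julia
# Hypothetical sketch: flag operator signatures containing non-Julian
# argument names, mimicking the WARNING lines printed above.
const NON_JULIAN_ARGS = Set(["axis", "dim1", "dim2", "begin", "end"])

function check_signature(name::AbstractString, args::Vector{String})
    bad = filter(in(NON_JULIAN_ARGS), args)
    isempty(bad) || println("WARNING: $name($(join(args, ", ")))")
    return isempty(bad)
end

check_signature("flip", ["data", "axis"])      # prints a WARNING, returns false
check_signature("zeros", ["shape", "dtype"])   # silent, returns true
```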
I can keep refining those APIs, but at the moment I want to make a new release first, so I won't add more.
end

"""
libmxnet operators signature checker.
Can you add more description to the docstring?
Thanks! Merged. But can you add a more detailed docstring for `_sig_checker`?
Sure.
It's #305
- remap `mean`, `sum`, `maximum`, `minimum`, `permutedims`, `prod`
- also fix the `axis` value mapping: `mean(arr, axis=0)` is not Julian
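With this remapping in place, NDArray reductions read like their Base counterparts. The same spellings on a plain Julia matrix (a sketch with Base arrays, since the NDArray methods are meant to mirror these generics):

```julia
using Statistics

A = [1 2 3; 4 5 6]               # 2×3 Matrix

total = sum(A)                   # 21, full reduction
cols  = sum(A, dims=1)           # [5 7 9], per-column sums
m     = mean(A, dims=2)          # per-row means
hi    = maximum(A)               # 6; replaces mx.max
lo    = minimum(A)               # 1; replaces mx.min
At    = permutedims(A, (2, 1))   # 3×2; Base-like axis permutation
p     = prod(A, dims=1)          # [4 10 18], per-column products
```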