This repository has been archived by the owner on Apr 28, 2023. It is now read-only.

How to express Broadcast with TC language? #626

Open
ghostplant opened this issue Oct 22, 2019 · 6 comments

@ghostplant

Most of the TC examples are actually for tensor reshape or reduction. Are there any examples of tensor broadcast?

The following attempts aren't supported by TC:

def broadcast(float(N, M) I0) -> (O) {
  O(n, m, k) = I0(n, m)
}

or

def broadcast(float(N, M) I0) -> (float(N, M, K) O) {
  O(n, m, k) = I0(n, m)
}
@shekhovt

shekhovt commented Nov 12, 2019

It seems that output size specification is not supported (#377). I think this one can be solved with

def broadcast(float(N, M) I0, int K) -> (O) {
    O(n, m, k) = I0(n, m) where k in 0:K
}

(I am looking at the reference https://facebookresearch.github.io/TensorComprehensions/semantics.html)
I haven't tried this yet.
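For reference, here is a pure-Python sketch of what that definition would compute; the function name and the nested-list tensor representation are just for illustration:

```python
def broadcast(I0, K):
    """Expand an N x M matrix into an N x M x K tensor,
    mirroring O(n, m, k) = I0(n, m) where k in 0:K."""
    N, M = len(I0), len(I0[0])
    return [[[I0[n][m] for _k in range(K)]
             for m in range(M)]
            for n in range(N)]
```

Every slice along the new trailing axis is just a copy of the corresponding input element, so `broadcast([[1, 2]], 3)[0][1]` is `[2, 2, 2]`.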

@ghostplant
Author

@shekhovt That's great, thank you! What about one_hot, which needs a conditional,
something like O(x, k) = (k == A(x)) ? 1 : 0?
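In plain Python the one_hot being asked about would amount to something like this (names and the list representation are hypothetical):

```python
def one_hot(A, K):
    """Build O(x, k) = 1 if k == A[x] else 0, i.e. the TC
    expression (k == A(x)) ? 1 : 0 spelled out per element."""
    return [[1 if k == A[x] else 0 for k in range(K)]
            for x in range(len(A))]
```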

@shekhovt

Note that you can use a lookup on the right-hand side, like
Loss += Logits(b, Target(b))
That is to say, expanding everything via broadcasting or one_hot might not be needed.

As I understand it, the language allows multi-dimensional reductions and lookups, but it is not going to be very useful for the simple standalone operations you mentioned. There is likely already an implementation in PyTorch, or it would be straightforward to do with CUDA extensions: https://pytorch.org/tutorials/advanced/cpp_extension.html
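The lookup-on-the-right-hand-side pattern above can be sketched in plain Python like this (function name is just illustrative):

```python
def gathered_sum(logits, target):
    """Accumulate Logits(b, Target(b)) over the batch index b,
    i.e. pick one entry per row via an index tensor and reduce,
    instead of materializing a one_hot mask and multiplying."""
    return sum(logits[b][target[b]] for b in range(len(target)))
```

This is the same access pattern as Loss += Logits(b, Target(b)): the inner index is itself a tensor lookup, so no broadcast or one_hot expansion is materialized.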

@ghostplant
Author

@shekhovt For broadcast:

def broadcast(float(N, M) I0, int K) -> (O) {
    O(n, m, k) = I0(n, m) where k in 0:K
}

How should I fill in the second parameter? It is not a tensor but a scalar. If I don't pass this value, I get:

terminate called after throwing an instance of 'lang::ErrorReport'
  what():
expected ) but found 'ident' here::
def broadcast(float(N, M) I0, int K) -> (O) {
                                  ~ <--- HERE
    O(n, m, k) = I0(n, m) where k in 0:K
}

@shekhovt

Can you actually compile TC with a recent CUDA and PyTorch? Right now conda installs TC against pytorch 0.3.1.post3 :( It does not seem very useful even if it can implement and autotune the op.

@ghostplant
Author

ghostplant commented Nov 15, 2019

@shekhovt No, I am just using CUDA 9.0 + PyTorch < 1.0 + TC, built in Docker to avoid polluting my environment. I only need the source code TC generates, which satisfies my requirement.
