This repository has been archived by the owner on Nov 15, 2022. It is now read-only.

K2 ragged tensor cooperation? #249

Open
janvainer opened this issue Sep 29, 2020 · 4 comments

Comments

@janvainer

janvainer commented Sep 29, 2020

Hi, first of all, thanks for this awesome project. It could be very useful for sequential machine learning with convnets.
I stumbled upon a related project, k2, that also aims to implement ragged tensors as part of its toolkit. Since k2 aims to be eventually compatible with PyTorch, would it make sense to join forces on the nested tensor functionality? Do nestedtensor and k2 know about each other? Regards, Jan
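To make the overlap concrete, here is a rough sketch of the kind of data both projects target, written in plain PyTorch only (nothing from nestedtensor or k2, whose actual APIs may differ):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# A ragged batch: variable-length sequences that don't fit one dense tensor.
batch = [torch.randn(5, 80), torch.randn(3, 80), torch.randn(7, 80)]

# The usual workaround today: pad to the longest sequence and carry a mask.
padded = pad_sequence(batch, batch_first=True)       # shape (3, 7, 80)
lengths = torch.tensor([t.shape[0] for t in batch])  # tensor([5, 3, 7])
mask = torch.arange(padded.shape[1])[None, :] < lengths[:, None]  # (3, 7)
```

Both nestedtensor and k2's ragged tensors aim to represent such batches directly, without the wasted padding compute and the bookkeeping mask.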

@cpuhrsch
Contributor

Hello Jan,

Thanks for posting this issue! It'd definitely be interesting to talk about this and figure out whether some of k2's requirements can be met with this project.

Thanks,
Christian

@janvainer
Author

janvainer commented Oct 3, 2020

Hi Christian, thanks for your reply. Yes, the projects seem to have overlapping targets. Unfortunately, I am not directly involved in the development of either of them (I am eagerly waiting for a stable release) and don't know the exact details, so you would probably have to contact Dan Povey directly. I was also curious about the timeline of the nested tensor project. Is there an approximate date for the first stable release (I see end of October in the README, is that info up to date)? What are some pain points where help would be needed? Regards, Jan

@cpuhrsch
Contributor

cpuhrsch commented Oct 5, 2020

Hello Jan,

We'll have a prototype binary that'll be refreshed every night at the end of October, but it'll be far from stable in terms of API, operator coverage, etc. At this point, I'm spending most of my time working on backend abstractions and integrating recent work such as multi_tensor_apply, which recently landed in pytorch/pytorch. Eventually, major operator coverage can come from just a few key abstractions such as TensorIterator. A lot of the shape-manipulating operations such as reshape can also often be reduced down to a few key operations. The backend has still been changing a lot, but hopefully we'll eventually get to the point where a lot of the key work can be split out into tasks that have a high probability of being merged, and where the contributed code won't be changed fundamentally further down the road, so that those contributions actually retain their value. A toy sketch of the lifting idea follows below.
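
To illustrate why a few abstractions can cover many operators, here is a toy sketch only: `map_nested` is a hypothetical helper, not the nestedtensor or k2 API, and a list of tensors stands in for the real nested representation:

```python
import torch

# Hypothetical helper (not the nestedtensor or k2 API): represent a "nested"
# tensor as a plain list of tensors and lift a pointwise op over the pieces.
def map_nested(op, nested):
    return [op(t) for t in nested]

nested = [torch.randn(5, 80), torch.randn(3, 80)]
out = map_nested(torch.relu, nested)  # one kernel launch per constituent

# Abstractions like TensorIterator or multi_tensor_apply aim to replace this
# Python-level loop with far fewer fused kernel launches on the backend side.
```

Once one such lifting mechanism exists, every pointwise operator it covers comes along essentially for free, which is why backend work dominates over per-operator work right now.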

Thanks,
Christian

@janvainer
Author

Thanks for the exhaustive reply. I am keeping an eye on this project. It will be really useful once stable. 😉
