
tf.addons.text ops coverage #11

Closed
bhack opened this issue Jan 18, 2022 · 5 comments

@bhack

bhack commented Jan 18, 2022

Are you interested in covering some of the ops in this namespace (especially CRF)?

https://www.tensorflow.org/addons/api_docs/python/tfa/text

@mattdangerw
Member

Thanks for bringing this up! This is helpful!

I am unsure how we could monitor usage of the TFA utilities, but I can poke around. That would help us prioritize here.

Re CRF, did you have a design in mind?

I think the highest-priority symbols of the bunch might be the skip_gram* functions. We have these in TFA, and another similar version in keras.preprocessing. These would be better off as a single preprocessing layer that could replace both usages. word2vec is definitely a commonly used model we would like this library to support easily.

I think a skip-gram preprocessing layer, with the goal of replacing the tf.keras.preprocessing.sequence.skipgrams usage in the official TensorFlow word2vec tutorial, is a good target.
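For reference, the core transformation such a layer would perform (positive pairs only, without the negative sampling that skipgrams also offers) can be sketched in pure Python. This is an illustrative sketch, not the TFA or keras.preprocessing implementation; the function name and window_size parameter are hypothetical:

```python
def skipgram_pairs(sequence, window_size=2):
    """Generate (target, context) skip-gram pairs from a token-id sequence.

    For each position i, every token within window_size of i (excluding
    i itself) is emitted as a context for the target token at i.
    """
    pairs = []
    for i, target in enumerate(sequence):
        lo = max(0, i - window_size)
        hi = min(len(sequence), i + window_size + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, sequence[j]))
    return pairs

print(skipgram_pairs([1, 2, 3], window_size=1))
# [(1, 2), (2, 1), (2, 3), (3, 2)]
```

A Keras preprocessing layer version would express the same windowing with TF ops so it could run inside a tf.data pipeline.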

@bhack
Author

bhack commented Jan 26, 2022

> I am unsure of how we could monitor for usage of the TFA utilities, but can poke around. That could help us prioritize well here.

We have a fairly old and long thread about this at:
tensorflow/addons#236

> Re CRF, did you have a design in mind?

I think that @howl-anderson, our new CRF codeowner, could give us some feedback.

> I think the highest priority symbol of the bunch might be the skip_gram* functions. We have these in TFA, and another similar version in keras.preprocessing. This would be better off as a preprocessing layer that could replace both of those usages.

This has an (old) custom op in TFA, so I hope you can handle it here with a pure compositional Python/TF solution.

We also have rnn and seq2seq namespaces that you could evaluate in this context:
https://www.tensorflow.org/addons/api_docs/python/tfa/rnn
https://www.tensorflow.org/addons/api_docs/python/tfa/seq2seq

/cc @seanpmorgan @yarri-oss @theadactyl

@Stealth-py
Contributor

Hey @mattdangerw! I wanted to ask if the Keras team is still interested in implementing a preprocessing layer for models such as skip-gram.

@bhack
Author

bhack commented Apr 8, 2022

/cc @seanpmorgan @yarri-oss

@bhack
Author

bhack commented Apr 22, 2022

@bhack bhack closed this as completed Apr 22, 2022