
TF Addons CV components #74

Closed
bhack opened this issue Jan 31, 2022 · 17 comments

Comments
@bhack
Contributor

bhack commented Jan 31, 2022

Are you interested in covering or migrating some of the CV components from the TF Addons namespace?

I suppose the starting point could be to review tfa.image, as we have already duplicated things like cutout/random_cutout etc. in this repo:
https://www.tensorflow.org/addons/api_docs/python/tfa/image

P.s. This is a parallel ticket of keras-team/keras-hub#11
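To make the overlap concrete, here is a minimal sketch (not part of the original comment) comparing the two cutout implementations. It assumes the tfa.image.random_cutout op and the keras_cv.layers.RandomCutout layer roughly as they existed around this time; exact signatures may differ between releases.

```python
import tensorflow as tf
import tensorflow_addons as tfa
import keras_cv

images = tf.random.uniform((2, 64, 64, 3))

# TF Addons: a functional op that zeroes out a fixed-size patch per image.
out_tfa = tfa.image.random_cutout(images, mask_size=(8, 8), constant_values=0)

# KerasCV: a preprocessing layer that cuts out a patch sized relative to the image.
layer = keras_cv.layers.RandomCutout(height_factor=0.2, width_factor=0.2)
out_kcv = layer(images)
```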

@bhack
Contributor Author

bhack commented Jan 31, 2022

/cc @seanpmorgan @yarri-oss @theadactyl

@LukeWood
Contributor

LukeWood commented Feb 1, 2022

Sure, I will do a pass over the components and see what we are interested in taking.

@bhack
Contributor Author

bhack commented Feb 10, 2022

We also need to evaluate whether we take ownership of the tf.image namespace and handle the deprecation in TensorFlow:
https://www.tensorflow.org/api_docs/python/tf/image

See more (/cc @qlzh727 ):
#122 (comment)

@qlzh727
Member

qlzh727 commented Feb 10, 2022

Checked with @fchollet offline. It seems that we don't have an active owner for the tf.image API, but at the same time, we can't easily remove the TF API due to the backward compatibility contract. Deprecating certain duplicated tf.image APIs could be an approach, until keras-cv has enough API/feature coverage.
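As a purely hypothetical illustration of that soft-deprecation approach (this is not the actual TF deprecation machinery, and the names below are invented for the example), a duplicated op could be kept working for backward compatibility while pointing callers at the preferred replacement:

```python
import functools
import warnings

def deprecated_in_favor_of(replacement):
    """Hypothetical helper: keep an op working for backward compatibility,
    but emit a warning pointing callers at the preferred replacement."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{fn.__name__} is deprecated; use {replacement} instead.",
                DeprecationWarning,
                stacklevel=2,
            )
            return fn(*args, **kwargs)
        return wrapper
    return decorator

# Imaginary duplicated image op, kept alive during the deprecation window.
@deprecated_in_favor_of("keras_cv.layers.RandomCutout")
def legacy_random_cutout(images, mask_size):
    ...  # original implementation stays in place until removal
```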

@bhack
Contributor Author

bhack commented Feb 10, 2022

Yes, I think any public TF API removal requires a deprecation window:

https://github.com/tensorflow/community/blob/master/governance/api-reviews.md

@bhack

This comment was marked as duplicate.

@LukeWood
Contributor

LukeWood commented Apr 9, 2022

Is there any reason to leave this open, @bhack? I'd like to begin triaging issues and closing them more thoroughly. Feel free to reopen inquiries about specific components.

@LukeWood LukeWood closed this as completed Apr 9, 2022
@bhack
Contributor Author

bhack commented Apr 9, 2022

Feel free to reopen inquiries about specific components.

If you close this one, which is general enough, then I will need to post to every specific ticket or PR that is going to create any sort of potential duplication with Addons, like #289.

I have honestly avoided doing this on every single ticket over these months because I was thinking that we could sit down and calmly assess the situation, as we have already "duplicated" or superseded many components here since February.

But we have not collected any specific position after two months, and it's a bit of a waste of resources.

@LukeWood
Contributor

LukeWood commented Apr 9, 2022

Okay, we can reopen this then and discuss.

@LukeWood LukeWood reopened this Apr 9, 2022
@LukeWood
Contributor

LukeWood commented Apr 17, 2022

@bhack for organizational purposes, can we keep discussions regarding duplication to this issue? I don't want to confuse users and have them close issues that they should not be closing. Re: the focal loss discussion here:

@sebastian-sz Honestly, the scope of #74 was to do this in an orderly manner, but it seems that, in the end, this kind of thing recurrently never gets managed in a clean way.

What do you have in mind regarding a clean and orderly way to manage migration? As it stands now, we are opening PRs here, holding API design reviews, and pulling ideas from various reference implementations. To me, this is organized, orderly, and, most importantly, delivers the best end-user API.

Please raise any issues with this process here.

@bhack
Contributor Author

bhack commented Apr 17, 2022

I have quite a clear view of how to do this, but I've reached the limit of my OSS/GitHub visibility.

I personally hoped we would not slowly make TF-Addons obsolete, PR after PR, in this repository, out of respect for the many years invested by its community of volunteers.

I opened this issue so we could sit down and identify how many TF Addons components we want to replicate/refactor here, and eventually put TF Addons into a sort of maintenance mode, but with a day-by-day (or PR-by-PR) approach we are not able to do this.

I hope you can follow up on this internally with the other TF members I've already mentioned in this ticket.

Thanks

@LukeWood
Contributor

Hi @bhack - I've done some reading, discussion, etc., and I have the following response to the issue about duplication of components.

The role of TF-Addons doesn't change -- a community-contributed set of extensions to TF and Keras APIs that are too niche to be included in core TF or core Keras (including KerasCV and KerasNLP).

It has always been the case that TF-Addons components are expected to potentially migrate to core if there's interest. For instance, that's what we did with the AdamW optimizer recently (although we reimplemented it rather than reusing the TF-Addons version). KerasCV and KerasNLP are basically the same as core Keras, just domain-specific, so the same expectations apply. We will migrate or reimplement TF-Addons components there if there's a need.

However, this will always be limited to a minority of TF-Addons components, and it does not affect TF-Addons' purpose and positioning. There is no risk of TF-Addons being "taken over" or "made irrelevant". There will always be a need for a repository of extensions that are too niche for the core API.

Rest assured, TF-Addons is still VERY relevant and important! KerasCV and KerasNLP are not a replacement for TF-Addons. They are not addons for TF; they are the same as the core Keras API, just more domain specific.

Finally, it is normal and expected that TF-Addons components will end up in the core APIs. This should be considered a success for the contributor, as their component is highly impactful and this means there is a clear need for the component!

Please let me know if there are any more questions. I want to make sure to effectively communicate with all parties involved.

@bhack
Contributor Author

bhack commented Apr 20, 2022

The role of TF-Addons doesn't change -- a community-contributed set of extensions to TF and Keras APIs that are too niche to be included in core TF or core Keras (including KerasCV and KerasNLP).

This has historically been very confusing, and over the years we have plenty of examples of components being duplicated, in some cases just a few weeks after the component was reviewed and contributed to Addons.

Also, we have almost the same paper-citation threshold in Keras-CV, Keras-NLP, and TF Addons, so I have strong doubts about the "too niche" claim, as I don't think many people are interested in maintaining "too niche" contributions. For that reason it is better to maintain the components in your own repository, since code ownership doesn't scale.

that's what we did with the AdamW optimizer recently (although we reimplemented it rather than reusing the TF-Addons version).

I needed to personally route the contributor from TF Addons to Keras for this (see tensorflow/addons#2681 (comment)).

However this will always be limited to a minority of TF-Addons components, and it does not affect TF-Addons' purpose and positioning.

I don't have the time right now to compile again the list of components that I need to subtract from TF Addons and that are already here in master, but it is certainly already quite long. They probably have a slightly different API or other implementation details, but that wasn't the point here.

Finally, it is normal and expected that TF-Addons components will end up in the core APIs. This should be considered a success for the contributor, as their component is highly impactful and this means there is a clear need for the component!

Almost all the cases are not about porting but about duplication, as often happens in the TF ecosystem. See the history of a process that has never worked:

tensorflow/community#239

tensorflow/community#242

@fchollet
Member

Hi Stefano,

Our key point here is that TF Addons serves a valuable purpose in the ecosystem, and the purpose of KerasCV is not to be a replacement for TF Addons CV tooling. We hope that folks will contribute to TF Addons and KerasCV. We value all contributions.

It is inevitable that there will be some duplication of functionality. That's fine, and we can live with it. In any large ecosystem, there will always be some level of duplication.

If a component gets adopted by core APIs, that doesn't mean the TF Addons version stops being useful or should be removed. For instance, there are many workflows today that use AdamW from TF Addons. It will probably take ~2 years for most users to transition to the newly introduced core version.

It just means that we think there should be a core version of the component. And that judgment call is not affected by whether something is already in TF Addons or not.

For instance, we added AdamW to Keras because we found that lots of people were using it (e.g. on keras.io). Critically, it is the availability of AdamW in TF Addons that helped point out the need. And further, when we added AdamW, we learned from the API and implementation details of the TF Addons version. Overall, TF Addons played an important role, and our end state is one that is nicer for every Keras user (just import keras.optimizers.AdamW).
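For readers following along, a rough sketch of that end state (the exact module paths depend on the TF/Keras and TF Addons versions installed, so treat this as illustrative rather than authoritative):

```python
import tensorflow_addons as tfa
from tensorflow import keras

# The pre-existing TF Addons optimizer that surfaced the demand.
opt_tfa = tfa.optimizers.AdamW(weight_decay=1e-4, learning_rate=1e-3)

# The core Keras reimplementation of the same idea.
opt_keras = keras.optimizers.AdamW(learning_rate=1e-3, weight_decay=1e-4)
```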

Almost all the cases are not about porting but about duplication, as often happens in the TF ecosystem. See the history of a process that has never worked:

There is no functional difference between migrating a component and reimplementing it. What matters is the role the component plays in a workflow and how people use it, not the specific code used to implement it. The point of "staging" a component in a package that has low backwards-compatibility requirements is not to try out the code, it's to try out the API, and to hedge for whether the component will end up being broadly useful or not. The point is to learn something, which we do, thanks to TF Addons. The fact that we don't migrate components via copy/pasting is not important.

@bhack
Contributor Author

bhack commented Apr 22, 2022

I honestly don't think that cherry-picking arguments to support a thesis that doesn't match what happens every day in the TF/Keras repositories and GitHub orgs will help us solve real problems that nobody wants to acknowledge.
I think we are not aligned, as we have two different points of view:

Competition between internal teams is natural, and I presume also encouraged, but from the external point of view, as a community, we would like to cooperate within the ecosystem and we have no interest in competing within the same ecosystem. The TF ecosystem does not have such a large mass of external contributors that it can afford redundancies.

I am closing this ticket because, given the approach to this discussion, I believe that everyone should think about the fate of their own project independently.

Honestly, I have no specific criticism of Keras-CV or Keras-NLP, provided these projects will really be community driven and will improve the alignment between internal team members and contributors, including in code ownership. I have already collaborated a bit in Keras-CV, but I will not personally contribute single components until the TF-Addons status is clear.

@seanpmorgan, the current TF Addons SIG lead, will evaluate what to do in coordination with the TF SIGs program managers.

P.S. I don't know why you focus on copy-pasting components. My point over these years was never about copy-pasting components, but about the API and about routing contributors.
Also, my point was always about not wasting the very limited community resources and not confusing contributors.

@bhack
Contributor Author

bhack commented May 3, 2022

There is no functional difference between migrating a component and reimplementing it.

tensorflow/tensorflow#55824

freedomtan pushed a commit to freedomtan/keras-cv that referenced this issue Jul 20, 2023
@jamesmyatt

jamesmyatt commented Nov 3, 2023

Does this issue need to be reopened now that TFA is deprecated? tensorflow/addons#2807

Is there a systematic mapping from tfa.image to KerasCV functions (including which ones will not be included or are in other libraries)?
