🗺️ Keras Development Roadmap #19519
Comments
I concur with @pure-rgb's observation that the API design has become overly elaborate and intricate for contributors. Previously, it was straightforward and comprehensible, but with the introduction of the Backbone API, it has become more convoluted.
The person who introduced that API design has already left Google. :p
@fchollet I've been going through the features you're planning to implement, and I'm particularly interested in contributing to KerasNLP. Specifically, I'm eager to get involved in the development of dynamic sequence length inference for PyTorch LLMs.
@kernel-loophole please open an issue on the KerasNLP repo if you'd like to contribute this feature!
okay thanks
I am a developer of TensorFlow Recommenders-Addons, and I now need to develop an all-to-all embedding layer for multi-GPU distributed training of recommendation models. The old TensorFlow distribution strategies clearly do not meet this need.
@MoFHeka Can you elaborate on what you need here?
@jeffcarp If a third-party custom op (primitive) is used in JAX training, it is difficult to convert the result to a saved_model for online inference, and jax2tf is not easy to use here. A TF custom op, by contrast, only needs to be compiled into TF Serving or preloaded as a dynamic link library. You may not be very concerned about how DTensor or JAX will evolve in the future, but for now a large number of recommendation models are trained with Keras, and I'm interested to hear what you think of these two frameworks as a Keras developer. After all, both frameworks have their own problems for us.
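For context, a minimal sketch of the standard jax2tf export path for a pure-JAX function (the toy predict function and params below are made up for illustration). Models that depend on third-party custom primitives are exactly where this path breaks down, since jax2tf has no lowering rule for primitives it does not know about, whereas a TF custom op only needs to be compiled into (or preloaded by) TF Serving:

```python
import tensorflow as tf
import jax.numpy as jnp
from jax.experimental import jax2tf

# Toy pure-JAX predict function; "params", "w" and "b" are illustrative names.
params = {"w": jnp.ones((4, 2)), "b": jnp.zeros((2,))}

def jax_predict(x):
    return jnp.dot(x, params["w"]) + params["b"]

# Convert the JAX function into a TF-compatible function and wrap it in a
# tf.Module so it can be written out as a SavedModel for serving.
tf_predict = jax2tf.convert(jax_predict, with_gradient=False)

module = tf.Module()
module.predict = tf.function(
    tf_predict,
    input_signature=[tf.TensorSpec([None, 4], tf.float32)],
    autograph=False,
)
tf.saved_model.save(module, "/tmp/jax_export")
```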
@MoFHeka Keras is focusing on JAX for distributed training.
Can you elaborate on what ops you need, and what your current workarounds are? cc @hertschuh who is working on recommenders.
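As a rough illustration of that direction, here is a minimal sketch of data-parallel training with the keras.distribution API, assuming the JAX backend (currently the primary target of this API):

```python
import keras
from keras import distribution

# Shard each batch across all visible accelerators (data parallelism).
distribution.set_distribution(distribution.DataParallel())

model = keras.Sequential([
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
# model.fit(x, y, ...) now runs data-parallel without further code changes.
```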
Thank you for your reply. Here is the TensorFlow Recommenders Addons op that stores and trains dynamic-shape embedding tables backed by a fully functional hashtable; it is designed for training ID features without a static hash map: https://github.com/tensorflow/recommenders-addons/blob/master/tensorflow_recommenders_addons/dynamic_embedding/core/ops/hkv_hashtable_ops.cc
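For readers unfamiliar with the feature, here is a purely conceptual Python sketch of what a dynamic-shape embedding table does; it is not the TFRA API, just an illustration of the idea that rows are created lazily the first time an ID is seen, so the table grows with the observed vocabulary instead of being pre-sized to a static hash space:

```python
import numpy as np

class DynamicEmbedding:
    """Conceptual sketch only (not the TFRA API): embedding rows are created
    lazily the first time an ID is seen, so the table grows with the observed
    vocabulary instead of being pre-sized to a static hash space."""

    def __init__(self, dim, seed=0):
        self.dim = dim
        self.rng = np.random.default_rng(seed)
        self.table = {}  # raw ID -> embedding row

    def lookup(self, ids):
        rows = []
        for i in ids:
            if i not in self.table:
                self.table[i] = self.rng.normal(scale=0.05, size=self.dim)
            rows.append(self.table[i])
        return np.stack(rows)

emb = DynamicEmbedding(dim=8)
vectors = emb.lookup([12345678901, 42, 12345678901])  # arbitrary 64-bit IDs
print(vectors.shape)  # (3, 8)
```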
Dear all, it would make life so much easier if Keras 3 supported some of the newer optimizers, e.g. Shampoo or schedule-free Adam. Any chance this could go on the roadmap? Thanks!
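In the meantime, a custom optimizer can be plugged into Keras 3 by subclassing keras.optimizers.Optimizer. The sketch below implements plain SGD with momentum rather than Shampoo or schedule-free Adam; it is only meant to show the build()/update_step() hooks such a contribution would use:

```python
import keras
from keras import ops

class SimpleMomentum(keras.optimizers.Optimizer):
    """Minimal custom optimizer: plain SGD with momentum, for illustration."""

    def __init__(self, learning_rate=0.01, momentum=0.9,
                 name="simple_momentum", **kwargs):
        super().__init__(learning_rate=learning_rate, name=name, **kwargs)
        self.momentum = momentum

    def build(self, variables):
        if self.built:
            return
        super().build(variables)
        # One velocity slot per trainable variable.
        self.velocities = [
            self.add_variable_from_reference(v, name="velocity")
            for v in variables
        ]

    def update_step(self, gradient, variable, learning_rate):
        lr = ops.cast(learning_rate, variable.dtype)
        velocity = self.velocities[self._get_variable_index(variable)]
        # v <- momentum * v - lr * g ;  w <- w + v
        self.assign(
            velocity,
            ops.subtract(
                ops.multiply(velocity, ops.cast(self.momentum, variable.dtype)),
                ops.multiply(gradient, lr),
            ),
        )
        self.assign_add(variable, velocity)

# Usage: model.compile(optimizer=SimpleMomentum(learning_rate=0.01), loss="mse")
```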
Here's an overview of the features we intend to work on in the near future.

Core Keras
- Saving & export
- Distribution (keras.distribution)
- Performance
- Modeling (including a tensor.at operator in keras.ops)

Ecosystem
- mlx backend (in progress on a separate branch)

KerasHub
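For reference on the tensor.at item above: the planned operator presumably mirrors the functional index-update semantics that JAX already offers, where arrays are immutable and .at[...] returns an updated copy. The sketch below uses plain JAX, not the future keras.ops API:

```python
import jax.numpy as jnp

x = jnp.zeros((5,))
# JAX arrays are immutable; .at[...] returns a new array with the update applied.
y = x.at[2].set(7.0)    # [0, 0, 7, 0, 0]
z = y.at[1:4].add(1.0)  # [0, 1, 8, 1, 0]
print(x)  # unchanged: [0, 0, 0, 0, 0]
print(z)
```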