
Conversation

@lbluque (Contributor) commented Dec 18, 2025

The next ray.serve release will include the option to use a function to determine batch sizes based on input data (ray-project/ray#59059).

This PR updates our code to use the total number of atoms to determine batch sizes in the BatchPredictServer.
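For illustration, the atom-counting logic from this PR's `batch_size_fn` lambda can be exercised with stand-in objects. The `Sample`, `_Natoms`, and `_Scalar` classes below are hypothetical mocks, not the real data classes; they model only the `natoms.sum().item()` access pattern that the lambda relies on:

```python
# Sketch: measuring batch "size" as total atoms rather than number of samples.
# These classes are minimal stand-ins for the real batch sample objects; only
# the `natoms` attribute used by the PR's lambda is modeled.

class _Scalar:
    def __init__(self, value):
        self._value = value

    def item(self):
        # Mimics tensor .item(): return the underlying Python number.
        return self._value


class _Natoms:
    """Stand-in for a tensor of per-structure atom counts."""

    def __init__(self, counts):
        self._counts = counts

    def sum(self):
        return _Scalar(sum(self._counts))


class Sample:
    def __init__(self, counts):
        self.natoms = _Natoms(counts)


# Equivalent in effect to the PR's lambda (the PR calls .item() once on the
# summed tensor; here we call it per sample, which yields the same total).
batch_size_fn = lambda batch: sum(sample.natoms.sum().item() for sample in batch)

batch = [Sample([12, 8]), Sample([30])]
print(batch_size_fn(batch))  # 50
```

With this sizing function, a batch of two samples containing 20 and 30 atoms counts as size 50, so the server's batch limit bounds total atoms rather than sample count.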

@lbluque added the enhancement (New feature or request) and patch (Patch version release) labels Dec 18, 2025
@meta-cla bot added the cla signed label Dec 18, 2025
@lbluque marked this pull request as draft December 18, 2025 23:43
@lbluque marked this pull request as ready for review January 6, 2026 20:59
@lbluque requested a review from rayg1234 January 6, 2026 21:00
@lbluque requested reviews from kjmichel and mshuaibii January 8, 2026 18:43
 def configure_batching(
-    self, max_batch_size: int = 32, batch_wait_timeout_s: float = 0.05
+    self,
+    max_batch_size: int = 32,
A reviewer (Contributor) commented:
Is this supposed to still be 32?

-@serve.batch
+@serve.batch(
+    batch_size_fn=lambda batch: sum(sample.natoms.sum() for sample in batch).item()
A reviewer (Contributor) commented:
Naive question: how does @serve.batch work here? How does batch_size_fn get incorporated?
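One plausible answer, sketched in plain Python: a size-function-driven batcher can greedily accumulate queued requests until adding the next one would push `batch_size_fn(batch)` past `max_batch_size`. This is a simplified model of the idea, not Ray Serve's actual implementation; see ray-project/ray#59059 for the real semantics, and note that `take_batch` is a hypothetical helper invented here for illustration:

```python
# Hedged sketch: how a batcher might use batch_size_fn instead of len(batch).
# Requests are popped from the queue while the batch's measured size stays
# within max_batch_size; at least one request is always taken so oversized
# single requests cannot stall the queue. Not Ray Serve's implementation.

from collections import deque


def take_batch(queue, batch_size_fn, max_batch_size):
    """Greedily pop requests from `queue`, keeping batch_size_fn(batch) <= max_batch_size."""
    batch = []
    while queue:
        candidate = batch + [queue[0]]
        if batch and batch_size_fn(candidate) > max_batch_size:
            break  # adding the next request would exceed the size budget
        batch.append(queue.popleft())
    return batch


# Example: requests sized by atom count, capped at 40 "atoms" per batch.
queue = deque([10, 25, 20, 5])
print(take_batch(queue, lambda b: sum(b), max_batch_size=40))  # [10, 25]
```

Under this model, passing the PR's atom-counting lambda as `batch_size_fn` makes `max_batch_size` a budget on total atoms per batch, so structures of very different sizes no longer produce wildly uneven batch workloads.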
