Batched inputs to genai python api #555
Labels:
component:support (How to do xyz?)
status:triaged (Issue/PR triaged to the corresponding sub-team)
type:feature request (New feature request/enhancement)
Description of the feature request:
Is it possible to have batched inputs with the genai python api? From the documentation, it currently appears that only Vertex AI supports batch prediction for Gemini.
Supporting batched inputs directly from this genai python api seems like a crucial feature for widespread Gemini adoption. The OpenAI python api supports it, and it is quite convenient. Would it be possible to add this feature?
What problem are you trying to solve with this feature?
This would be very useful in cases where we want to run large batches of predictions with Gemini directly from python, without having to go through Vertex AI. It would be a convenient feature to have overall.
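In the meantime, one common workaround is to batch on the client side by issuing requests concurrently. The sketch below is a minimal, order-preserving helper using only the standard library; `fake_generate` is a hypothetical stand-in for a real model call (e.g. something like `model.generate_content(prompt).text` in the genai python api), not an API this library actually provides for batching.

```python
from concurrent.futures import ThreadPoolExecutor


def batch_generate(prompts, generate, max_workers=8):
    """Run `generate` over many prompts concurrently, preserving input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map yields results in the same order as `prompts`,
        # even though the underlying calls run concurrently.
        return list(pool.map(generate, prompts))


# Hypothetical stand-in for a real model call; replace with the actual
# genai call in your environment.
def fake_generate(prompt):
    return f"response to: {prompt}"


if __name__ == "__main__":
    results = batch_generate(["a", "b", "c"], fake_generate)
    print(results)
```

This does not reduce cost the way a server-side batch endpoint would; it only parallelizes individual requests, so rate limits still apply per request.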