A command-line tool for automatically adding missing translations to ARB files using Google Gemini or OpenAI ChatGPT, by LeanCode
$ dart pub global activate arb_translate
arb_translate has been designed to seamlessly integrate with Flutter apps using flutter_localizations for code generation from ARB files. Thanks to this integration, the setup process for arb_translate can be completed in just a few steps:
- Generate your API key. You can create your Gemini key here or your OpenAI key here.
- Save your API key in the environment variable ARB_TRANSLATE_API_KEY or add arb-translate-api-key: {your-api-key} to l10n.yaml in your project.
- If you are using ChatGPT, select OpenAI as the model provider. To do it, add arb-translate-model-provider: open-ai to l10n.yaml or use the command argument --model-provider open-ai.
- (Optional) Select the model used for translation. To do it, add arb-translate-model: {model} to l10n.yaml or use the command argument --model. The available options are: gemini-1.0-pro (default for Gemini), gemini-1.5-pro, gemini-1.5-flash, gpt-3.5-turbo (default for OpenAI), gpt-4, gpt-4-turbo, and gpt-4o.
- (Optional) Add the context of your application with arb-translate-context: {your-app-context}, e.g. "sporting goods store app". A combined example l10n.yaml is shown after this list.
All other required parameters match flutter_localizations parameters and will be read from the l10n.yaml file. You can override them using command arguments if necessary. See arb_translate --help for more information.
If your project doesn't include an l10n.yaml configuration, you have to provide the configuration using environment variables and command arguments. You also have to provide:
- --arb-dir: the directory where the template and translated ARB files are located
- --template-arb-file: the template ARB file that will be used as the basis for translation
See arb_translate --help for more information.
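For example, assuming the API key is exported in the shell and using illustrative paths and file names:
$ export ARB_TRANSLATE_API_KEY={your-api-key}
$ arb_translate --arb-dir lib/l10n --template-arb-file app_en.arb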
You can use arb_translate with the Vertex AI service from Google Cloud Platform, but the configuration takes a few more steps:
- Create your GCP project and enable Vertex AI by following https://cloud.google.com/vertex-ai/docs/generative-ai/start/quickstarts/api-quickstart
- Generate your API token using the gcloud CLI:
$ gcloud auth print-access-token
- Save your API token in the environment variable ARB_TRANSLATE_API_KEY, add arb-translate-api-key: {your-api-key} to l10n.yaml, or specify it as the command argument --api-key {your-api-key}
- Add arb-translate-model-provider: vertex-ai to l10n.yaml or specify it as the command argument --model-provider vertex-ai
- Add arb-translate-vertex-ai-project-url: {your-project-url} to l10n.yaml or specify it as the command argument --vertex-ai-project-url {your-project-url}. The project URL should look like this: https://{region}-aiplatform.googleapis.com/v1/projects/{your-project-id}/locations/{region}/publishers/google/models
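For illustration, the Vertex AI entries in l10n.yaml could look like this, where us-central1 and my-project are placeholder region and project IDs:
arb-translate-model-provider: vertex-ai
arb-translate-api-key: {your-api-key}
arb-translate-vertex-ai-project-url: https://us-central1-aiplatform.googleapis.com/v1/projects/my-project/locations/us-central1/publishers/google/models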
You can use arb_translate with any model with an OpenAI-compatible API. To configure a custom model:
- Add arb-translate-model-provider: custom to l10n.yaml or specify it as the command argument --model-provider custom
- Add arb-translate-custom-model: {your-model-name} to l10n.yaml or specify it as the command argument --custom-model {your-model-name}
- Add arb-translate-custom-model-provider-base-url: {your-model-url} to l10n.yaml or specify it as the command argument --custom-model-provider-base-url {your-model-url}
- (Optional) Set the target batch size appropriately for the model's token count limits by adding arb-translate-batch-size: {size} to your l10n.yaml or specifying the command argument --batch-size {size}. Batch size is the number of characters of ARB messages in a single batch and does not include the prompt or app context.
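As a sketch, the custom-provider entries in l10n.yaml could look like the following; the model name, base URL, and batch size are purely illustrative and depend on the OpenAI-compatible server you run:
arb-translate-model-provider: custom
arb-translate-custom-model: llama-3-8b-instruct
arb-translate-custom-model-provider-base-url: http://localhost:11434/v1
arb-translate-batch-size: 4096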
To generate translations, simply call arb_translate. All messages included in the template ARB file but missing from other files will be translated. To add a new locale, simply add an empty ARB file.
$ arb_translate
Or, without an l10n.yaml file:
$ arb_translate --arb-dir...
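For example, given an illustrative template file app_en.arb and an empty file for a new locale, app_pl.arb, running arb_translate fills in the missing messages in app_pl.arb (file names here are placeholders and should follow your template file's naming pattern):
app_en.arb (template):
{
  "@@locale": "en",
  "helloWorld": "Hello World!"
}
app_pl.arb (new, empty locale file):
{}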
If you want to know how we built this tool and what challenges we faced, read the story.
Built with ☕️ by LeanCode