
Ask: Are there any parameters that can be used to speed up translation? And how does it work? #436

Open
JinTTTT opened this issue Dec 2, 2024 · 7 comments



JinTTTT commented Dec 2, 2024

My example: a 600 KB .epub file takes 30 minutes to translate using GPT-3.5.

yihong0618 (Owner)

Sorry, we do not use concurrency for now; the only thing you can do is choose a faster model, like Groq.

mkXultra (Collaborator) commented Dec 3, 2024

Perhaps using --batch might make it faster.

parser.add_argument(
"--batch",
dest="batch_flag",
action="store_true",
help="Enable batch translation using ChatGPT's batch API for improved efficiency",
)
parser.add_argument(
"--batch-use",
dest="batch_use_flag",
action="store_true",
help="Use pre-generated batch translations to create files. Run with --batch first before using this option",
)
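As a quick sanity check, the two flags above can be exercised like this (a minimal sketch; the parser here is rebuilt locally for illustration, not imported from the project):

```python
import argparse

# Local stand-in for the project's parser, with just the two batch flags.
parser = argparse.ArgumentParser()
parser.add_argument("--batch", dest="batch_flag", action="store_true")
parser.add_argument("--batch-use", dest="batch_use_flag", action="store_true")

# First pass: request batch translation.
args = parser.parse_args(["--batch"])
print(args.batch_flag, args.batch_use_flag)  # True False

# Second pass, later: consume the finished batch results.
args = parser.parse_args(["--batch-use"])
print(args.batch_flag, args.batch_use_flag)  # False True
```

Because both use `action="store_true"`, each flag defaults to False and flips to True only when passed, which is why the two runs are separate invocations.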

Because it runs as a batch, there is no rate limit. However, according to OpenAI, a batch is completed within 24 hours.
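For context, OpenAI's Batch API takes a JSONL file where each line is one self-contained request. A sketch of what a single line might look like for a translation job (the `custom_id` and message content are made-up examples, not what bilingual_book_maker actually emits):

```python
import json

# One line of the JSONL input file the Batch API expects.
# custom_id lets you match results back to source paragraphs.
request = {
    "custom_id": "paragraph-0001",  # hypothetical ID
    "method": "POST",
    "url": "/v1/chat/completions",
    "body": {
        "model": "gpt-4o-mini",
        "messages": [
            {"role": "user", "content": "Translate to Japanese: Hello, world."}
        ],
    },
}
line = json.dumps(request)
print(line)
```

The whole file is uploaded once and processed asynchronously, which is why there is no per-request rate limit but also no guarantee of results sooner than the 24-hour completion window.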

mkXultra (Collaborator) commented Dec 3, 2024

I also recommend the gpt-4o-mini model.
It is faster, cheaper, and higher quality.


uyiewnil commented Dec 4, 2024

I also recommend the gpt4o-mini model. It is faster, cheaper and better quality.

@mkXultra How can I use the 4o-mini model? I've tried a few methods, but none of them seem to work.

mkXultra (Collaborator) commented Dec 4, 2024

@uyiewnil
Like this:

python3 make_book.py --book_name ./Clarimonde.epub \
    --openai_key {api_key} \
    --translate-tags h1,h2,h3,div,p \
    --model gpt4omini --language ja

If you want to use batch mode:

python3 make_book.py --book_name ./Clarimonde.epub \
    --openai_key {api_key} \
    --translate-tags h1,h2,h3,div,p \
    --language ja \
    --model gpt4omini \
    --batch
# a few minutes later
python3 make_book.py --book_name ./Clarimonde.epub \
    --openai_key {api_key} \
    --translate-tags h1,h2,h3,div,p \
    --language ja \
    --model gpt4omini \
    --batch-use

JinTTTT (Author) commented Dec 4, 2024

@mkXultra
I tried using --batch and afterward --batch-use, but it didn't work on my side; it just keeps telling me the batch is not complete yet.

mkXultra (Collaborator) commented Dec 4, 2024

Hmm. You can check its status in the Batches section of the OpenAI dashboard.
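If --batch-use keeps reporting that the batch is not complete, the batch object's status field explains why. A sketch of interpreting the status values OpenAI documents for batch objects (the `describe` helper and its grouping are illustrative, not part of bilingual_book_maker):

```python
# Status values documented for OpenAI batch objects.
STILL_RUNNING = {"validating", "in_progress", "finalizing"}
DONE = {"completed"}
# These will never produce usable results; the batch must be resubmitted.
DEAD = {"failed", "expired", "cancelling", "cancelled"}

def describe(status: str) -> str:
    """Illustrative helper: map a batch status to a next step."""
    if status in STILL_RUNNING:
        return "still running: wait and retry --batch-use later"
    if status in DONE:
        return "results ready: run --batch-use"
    if status in DEAD:
        return "will not finish: re-run --batch"
    return "unknown status"

print(describe("in_progress"))
print(describe("completed"))
```

"Not complete yet" while the dashboard shows `validating` or `in_progress` is normal; only a terminal state like `failed` or `expired` means the batch needs to be resubmitted.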

