[Docs] Spanish Translation -Torchscript md & Trainer md #29310
Conversation
Hi @njackman-2344. Thank you, the translations in both documents are of great quality. I only have some feedback.
docs/source/es/trainer.md
Outdated
* [`~Trainer.get_train_dataloader`] crea un entrenamiento de DataLoader
* [`~Trainer.get_eval_dataloader`] crea una evaluación DataLoader
* [`~Trainer.get_test_dataloader`] crea una prueba de DataLoader
* [`~Trainer.log`] anota información de los objetos varios que observa el entrenamiento
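For context on the hooks listed above, here is a minimal sketch of how they are typically overridden, assuming the standard `transformers` Trainer API (the subclass name is illustrative and not part of this PR):

```python
from torch.utils.data import DataLoader
from transformers import Trainer


class MiTrainer(Trainer):
    def get_train_dataloader(self) -> DataLoader:
        # Build (here: simply reuse) the DataLoader used for training.
        return super().get_train_dataloader()

    def compute_loss(self, model, inputs, return_outputs=False):
        # "computa la pérdida": run a forward pass and read the loss from the outputs.
        outputs = model(**inputs)
        loss = outputs.loss
        return (loss, outputs) if return_outputs else loss
```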
- Add "anota la información" in line 116
- I think that you can translate
an optimizer y rate scheduler
in the sentence of line 117 - Chage “computa el perdido” to “computa la pérdida” in line 118.
Copy that. I put tasa programada (rate scheduler) but I still don't feel that's sufficient. What do you think?
### Implicaciones
Los modelos transformers basados en la arquitectura [BERT (Bidirectional Encoder Representations from Transformers)](https://huggingface.co/docs/transformers/main/model_doc/bert), o sus variantes como [distilBERT](https://huggingface.co/docs/transformers/main/model_doc/distilbert) y [roBERTa](https://huggingface.co/docs/transformers/main/model_doc/roberta), funcionan mejor en Inf1 para tareas no generativas como la respuesta a preguntas extractivas, la clasificación de secuencias y la clasificación de tokens. Sin embargo, las tareas de generación de texto aún pueden adaptarse para ejecutarse en Inf1 según este [tutorial de AWS Neuron MarianMT](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/src/examples/pytorch/transformers-marianmt.html). Se puede encontrar más información sobre los modelos que se pueden convertir fácilmente para usar en Inferentia en la sección de [Model Architecture Fit](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/neuron-guide/models/models-inferentia.html#models-inferentia) de la documentación de Neuron.
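As background for the torchscript.md page under review, a small sketch of the tracing workflow it documents; the checkpoint and dummy inputs are illustrative examples, not taken from this PR:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", torchscript=True)
model.eval()

# Dummy inputs fix the tensor shapes that torch.jit.trace will record.
dummy = tokenizer("Hola, mundo", return_tensors="pt")
traced = torch.jit.trace(model, (dummy["input_ids"], dummy["attention_mask"]))
torch.jit.save(traced, "bert_traced.pt")
```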
I don't know why the links in this paragraph don't show their preview (the blue color). The same thing happens in line 22 of trainer.md.
Very nice job!
Hi @njackman-2344, LGTM 👍. Just two little changes in trainer.md and it's done:

- In line 70, remove "con" in "...o algo para preprocesar el conjunto de datoscon (...".
- In line 187, remove "parametros" at the end of "...Puedes cambiar el nivel de logging con los parametros log_level y log_level_replicaparametros en [TrainingArguments]."
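For reference, the `log_level` and `log_level_replica` parameters discussed above can be set like this; a minimal sketch assuming the standard `TrainingArguments` API, with placeholder values:

```python
from transformers import TrainingArguments

# output_dir and the log levels below are placeholders, shown only to make the parameters concrete.
args = TrainingArguments(
    output_dir="salida",
    log_level="info",             # verbosity on the main process
    log_level_replica="warning",  # verbosity on replica processes
)
```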
It'll close it, but I can reopen 😄
@@ -55,13 +55,17 @@
  - local: custom_models
    title: Compartir modelos personalizados
  - local: run_scripts
  - local: trainer
    title: Entrenador
There are two `title`s here, which is causing the doc-builder to fail. I think we just want to keep `Entrenador` and remove `Entrenamiento con scripts`?
Will change it :) Thanks!
Ah sorry, I wasn't clear: this should be as shown below, because now `- local: run_scripts` doesn't have a `title` under it!

  - local: run_scripts
    title: Entrenamiento con scripts
  - local: trainer
    title: Entrenador
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
…29310)
* torchscript and trainer md es translation
* corrected md es files and even corrected spelling in en md
* made es corrections to trainer.md
* deleted entrenamiento... title on yml
* placed entrenamiento in right place
What does this PR do?
Fixes #28936 (a couple of markdown files).
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case. [i18n-es] Translating docs to Spanish #28936
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu @gisturiz @aaronjimv
Whew!! I look forward to any changes needed. Thanks!! :)