
[Docs] Spanish Translation - Torchscript md & Trainer md #29310

Merged

stevhliu merged 5 commits into huggingface:main from njackman-2344:translate-md-es on Mar 4, 2024

Conversation

@njackman-2344 (Contributor) commented Feb 27, 2024

What does this PR do?

Fixes #28936 for a couple of markdown files.

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@stevhliu @gisturiz @aaronjimv

Whew!! I look forward to any changes needed. Thanks!! :)

@aaronjimv (Contributor) left a comment


Hi @njackman-2344. Thank you, the translations in both documents are of great quality. I only have some feedback.

* [~Trainer.get_train_dataloader] crea un entrenamiento de DataLoader
* [~Trainer.get_eval_dataloader] crea una evaluación DataLoader
* [~Trainer.get_test_dataloader] crea una prueba de DataLoader
* [~Trainer.log] anota información de los objetos varios que observa el entrenamiento
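The translated lines above describe `Trainer` methods that users can override in a subclass. As a framework-free sketch of that override pattern (note: `BaseTrainer` here is a hypothetical stand-in for `transformers.Trainer`, not the real API), it might look like:

```python
# Hypothetical stand-in for transformers.Trainer, used only to
# illustrate the override pattern the translated lines describe.
class BaseTrainer:
    def __init__(self):
        self.history = []

    def log(self, logs):
        # Base behavior: record the metrics dict.
        self.history.append(logs)

class MyTrainer(BaseTrainer):
    def log(self, logs):
        # Annotate the logged information before delegating to the
        # base class, mirroring how Trainer.log can be customized.
        super().log({**logs, "annotated": True})

trainer = MyTrainer()
trainer.log({"loss": 0.5})
print(trainer.history)  # [{'loss': 0.5, 'annotated': True}]
```

The same pattern applies to `get_train_dataloader`, `get_eval_dataloader`, and `get_test_dataloader`: override the method, customize, and delegate to `super()` where the default behavior should be kept.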
@aaronjimv (Contributor) commented Feb 27, 2024


  • Add "anota la información" in line 116.
  • I think you can translate "an optimizer and rate scheduler" in the sentence in line 117.
  • Change "computa el perdido" to "computa la pérdida" in line 118.

Contributor Author

Copy that. I put "tasa programada" (rate scheduler), but I still don't feel that's sufficient. What do you think?


### Implicaciones

Los modelos transformers basados en la arquitectura [BERT (Bidirectional Encoder Representations from Transformers)](https://huggingface.co/docs/transformers/main/model_doc/bert), o sus variantes como [distilBERT](https://huggingface.co/docs/transformers/main/model_doc/distilbert) y [roBERTa](https://huggingface.co/docs/transformers/main/model_doc/roberta), funcionan mejor en Inf1 para tareas no generativas como la respuesta a preguntas extractivas, la clasificación de secuencias y la clasificación de tokens. Sin embargo, las tareas de generación de texto aún pueden adaptarse para ejecutarse en Inf1 según este [tutorial de AWS Neuron MarianMT](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/src/examples/pytorch/transformers-marianmt.html). Se puede encontrar más información sobre los modelos que se pueden convertir fácilmente para usar en Inferentia en la sección de [Model Architecture Fit](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/neuron-guide/models/models-inferentia.html#models-inferentia) de la documentación de Neuron.
Contributor

I don't know why the links in this paragraph don't show their preview (the blue color). This also happens in line 22 of trainer.md.

@stevhliu (Member) left a comment

Very nice job!

@aaronjimv (Contributor) left a comment

Hi @njackman-2344, LGTM 👍. Just two little changes in trainer.md and it's done:

  • In line 70. Remove "con" in "...o algo para preprocesar el conjunto de datos con (..."

  • In line 187. Remove "parametros" at the end of "...Puedes cambiar el nivel de logging con los parametros log_level y log_level_replica parametros en [TrainingArguments]."

@aaronjimv (Contributor)

Hello @stevhliu, I have a question: will this PR close #28936, or will it stay open?

@stevhliu (Member) commented Mar 4, 2024

this PR will close #28936 or will it stay open?

It'll close it, but I can reopen 😄

  title: Compartir modelos personalizados
- local: run_scripts
- local: trainer
  title: Entrenador
Member

There are two titles here, which is causing the doc-builder to fail. I think we just want to keep Entrenador and remove Entrenamiento con scripts?

Contributor Author

Will change it :) Thanks!

Member

Ah, sorry I wasn't clear; this should be as shown below, because now - local: run_scripts doesn't have a title under it!

- local: run_scripts
   title: Entrenamiento con scripts
- local: trainer
   title: Entrenador

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@stevhliu stevhliu merged commit e947683 into huggingface:main Mar 4, 2024
@njackman-2344 njackman-2344 deleted the translate-md-es branch March 4, 2024 22:22


Development

Successfully merging this pull request may close these issues.

[i18n-es] Translating docs to Spanish

4 participants