[docs] Spanish translation of attention.md #29681
Conversation
cc: @tadeodonegana @gisturiz Hello guys, I would appreciate it if you could review the translation. Thank you.
Thanks, it LGTM from my end! We can merge once the content has been reviewed 🤗
docs/source/es/attention.md (Outdated)

> # Mecanismos de atención
>
> La mayoría de los modelos de transformadores utilizan atención completa, en el sentido de que la matriz de atención es cuadrada. Esto puede ser un gran cuello de botella computacional cuando tienes textos largos. `Longformer` y `reformer` son modelos que intentan ser más eficientes y utilizan una versión dispersa de la matriz de atención para acelerar el entrenamiento.
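For context, the quoted paragraph says that most transformer models use full attention, meaning the attention matrix is square, which becomes a major computational bottleneck for long texts, and that `Longformer` and `reformer` use a sparse version of the attention matrix to speed up training. A minimal PyTorch sketch of why full attention is quadratic in sequence length (illustrative only, not part of the PR; the tensor sizes are arbitrary):

```python
import torch

seq_len, d_model = 1024, 64
queries = torch.randn(seq_len, d_model)  # one query vector per position
keys = torch.randn(seq_len, d_model)     # one key vector per position

# Full attention: every position attends to every other position, so the
# score matrix is square. Memory and compute grow as O(seq_len ** 2),
# which is the bottleneck the paragraph describes for long texts.
scores = queries @ keys.T / d_model**0.5
weights = torch.softmax(scores, dim=-1)
print(weights.shape)  # torch.Size([1024, 1024])
```

Sparse-attention models such as Longformer avoid materializing this full square matrix by letting most positions attend only to a local window of neighbors.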
When writing about transformers (specifically in the phrase "modelos de transformadores"), I believe that using 'transformadores' may not be the most accurate translation. In my opinion, it is preferable to consistently refer to them as "transformers," regardless of the language being used.
@aaronjimv @stevhliu I just added this comment. I'm not sure if it's correct, but I wanted to express my thoughts on this.
Hi @tadeodonegana, thanks, I think it's a good point. I'm going to use "transformers".
I think when referring to the library, it is good to refer to it as just Transformers (no translation), but when referring to the models in a more generic way, it is okay to translate it.
Sounds good to me! Happy to help review future translations! Great work @aaronjimv!
LGTM!
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
* add attention to es/ and edit es/_toctree.yml
* translate attention.md
* fix transformers
* fix transformers
What does this PR do?
Add the Spanish version of attention.md to transformers/docs/source/es.

#28936
Fixes # (issue)
Before submitting

* Did you read the contributor guideline, Pull Request section?
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
@stevhliu