v5 #3177
base: main
Conversation
merveenoyan
left a comment
I know it's a draft, just left some suggestions for clarity 🤗
merveenoyan
left a comment
more comments!
also smol reminder to add to _blog.yml 🙌🏻
Really nice blog post. My suggestions make the prose a bit punchier with simpler grammar.
Also, remember to add it to _blog.yml 🤗.
pcuenca
left a comment
First quick pass, looking great!
As a general comment, in the middle sections there are many paragraphs that start with "We have", "We are", which may feel a bit monotonous. We (lol) could slightly reword a few of these.
transformers-v5.md
Outdated
> This growth is linked to the growth of the field and the now mainstream access of AI, no doubt; as a leading model-definition library in the ecosystem, we need to continuously evolve and adapt the library to continue being relevant. Reinvention is key for longevity in AI.
>
> We’re lucky to be working with a great number of libraries and apps working with transformers, in no specific order: llama.cpp, MLX, onnxruntime, Jan, LMStudio, vLLM, SGLang, Unsloth, LlamaFactory, dLLM, MaxText, TensorRT, Argmax, among many other friends.
I think the segue from transformers being successful to talking about the libraries we work with might not be immediately apparent to readers who may not be following our previous comms. We could be explicit that one of the ingredients for success is our (aspirational) role to support and act as a reference for others. We could perhaps link to the previous post as well: https://huggingface.co/blog/transformers-model-definition.
transformers-v5.md
Outdated
> Transformers, at the core, remains a model architecture toolkit: we aim to have all recent architectures, and to act as the “source of truth” of such model definitions in the ecosystem. We’ve been adding between 1 and 3 new architectures to the toolkit every week over the past 5 years, as can be seen in the timeline below:
>
> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers_v5/transformers_model_timeline.png" alt="Transformers standardizing model definitions">
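As a sketch of what “model architecture toolkit” means in practice, the snippet below instantiates a tiny BERT purely from a config using the public `transformers` API — no checkpoint download, just the architecture definition the library carries. The sizes are illustrative, not from the post:

```python
from transformers import BertConfig, BertModel

# Illustrative sizes (not from the post): a deliberately tiny BERT,
# built from a config alone — the model definition lives in the library.
config = BertConfig(
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
)
model = BertModel(config)

print(type(model).__name__)      # BertModel
print(config.num_hidden_layers)  # 2
```

This is also why downstream projects can treat transformers as the reference implementation: the config fully describes the architecture, independently of any weights.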
Super nice! Will it render too small?
Co-authored-by: burtenshaw <[email protected]> Co-authored-by: Merve Noyan <[email protected]> Co-authored-by: Pedro Cuenca <[email protected]> Co-authored-by: Alvaro Bartolome <[email protected]>
transformers-v5.md
Outdated
> <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers_v5/transformers_model_timeline.png" alt="Transformers standardizing model definitions">
>
> [*https://huggingface.co/spaces/yonigozlan/Transformers-Timeline*](https://huggingface.co/spaces/yonigozlan/Transformers-Timeline)
why not embed the Space?
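If embedding is the way to go, Spaces can be dropped into a page with a plain iframe. The direct URL below is an assumption based on the usual `<owner>-<space-name>.hf.space` convention and should be double-checked against the Space’s “Embed this Space” dialog:

```html
<!-- Hypothetical embed for the timeline Space; verify the URL via the
     Space's "Embed this Space" option before publishing. -->
<iframe
  src="https://yonigozlan-transformers-timeline.hf.space"
  frameborder="0"
  width="850"
  height="450"
></iframe>
```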
need turkey branding/logo for this?
> we're also working with partners in the JAX ecosystem to ensure we have compatibility between our models and this ecosystem.
ah, very nice
> We took it as an opportunity to clean up the toolkit and isolate what mattered; we now have a clean slate on top of which to build. Thanks to the many changes from the community and team, improvements in performance, usability, and readability will be simpler to ship.
>
> Now that v5.0.0's first RC is out there, we'll be eagerly awaiting your [feedback](https://github.com/huggingface/transformers/issues/40822).
Perhaps we want to slightly update the description in that issue after we release this post, to focus on feedback rather than the process you described before.
Co-authored-by: Pedro Cuenca <[email protected]>
ebezzam
left a comment
Small typos in the quotes, and rewording in some parts. Awesome 🚀
> The overarching theme of this version 5 release is “interoperability”. All refactors, performance improvements, and standardization are aligned with this theme. v5 plays nicely and end-to-end with the growing ecosystem: train a model with Unsloth/Axolotl/LlamaFactory/MaxText, deploy it with vLLM/SGLang, and export it to llama.cpp/executorch/MLX to run locally!
>
> Version 5 is undeniably an accomplishment of the past five years by a very large number of people in our community. We also see it as a promise, and as a beacon of the direction we want to go.
🎉
Co-authored-by: burtenshaw <[email protected]>
🙌 Transformers v5 🙌