
Conversation

@LysandreJik (Member) commented Nov 19, 2025

🙌 Transformers v5 🙌

@merveenoyan (Contributor) left a comment

I know it's a draft, just left some suggestions for clarity 🤗

@merveenoyan (Contributor) left a comment

more comments!

@merveenoyan (Contributor) commented Nov 20, 2025

also smol reminder to add to _blog.yml 🙌🏻

@burtenshaw (Collaborator) left a comment

Really nice blog post. My suggestions make the prose a bit punchier with simpler grammar.

Also, remember to add it to _blog.yml 🤗 .
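For reference, entries in the blog repo's `_blog.yml` generally follow a shape like the sketch below. This is a hypothetical entry for illustration only: the slug, author handle, thumbnail path, date, and tags are placeholders, not the final values.

```yaml
# Hypothetical _blog.yml entry; all values below are placeholders.
- local: transformers-v5            # slug matching the post's markdown file
  title: "Transformers v5"
  author: lysandre                  # Hub username of the author
  thumbnail: /blog/assets/transformers-v5/thumbnail.png
  date: December 1, 2025
  tags:
    - transformers
```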

@pcuenca (Member) left a comment

First quick pass, looking great!

As a general comment, in the middle sections there are many paragraphs that start with "We have", "We are", which may feel a bit monotonous. We (lol) could slightly reword a few of these.


This growth is no doubt linked to the growth of the field and the now-mainstream access to AI; as a leading model-definition library in the ecosystem, we need to continuously evolve and adapt so the library remains relevant. Reinvention is key to longevity in AI.

We’re lucky to work with a great number of libraries and apps built on transformers, in no specific order: llama.cpp, MLX, onnxruntime, Jan, LMStudio, vLLM, SGLang, Unsloth, LlamaFactory, dLLM, MaxText, TensorRT, Argmax, among many other friends.
Member left a comment

I think the segue from transformers being successful to talking about the libraries we work with might not be immediately apparent to readers who may not have followed our previous comms. We could be explicit that one of the ingredients for success is our (aspirational) role to support and act as a reference for others. We could perhaps link to the previous post as well: https://huggingface.co/blog/transformers-model-definition.



Transformers, at its core, remains a model-architecture toolkit: we aim to have all recent architectures, and to act as the “source of truth” for such model definitions in the ecosystem. We’ve been adding between 1 and 3 new architectures to the toolkit every week over the past 5 years, as can be seen in the timeline below:

<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers_v5/transformers_model_timeline.png" alt="Transformers standardizing model definitions">
Member left a comment

Super nice! Will it render too small?

LysandreJik and others added 6 commits November 25, 2025 12:13
Co-authored-by: burtenshaw <[email protected]>
Co-authored-by: Merve Noyan <[email protected]>
Co-authored-by: Pedro Cuenca <[email protected]>
Co-authored-by: Alvaro Bartolome <[email protected]>
Co-authored-by: burtenshaw <[email protected]>
Co-authored-by: Merve Noyan <[email protected]>
Co-authored-by: Merve Noyan <[email protected]>
Co-authored-by: Pedro Cuenca <[email protected]>
Co-authored-by: Alvaro Bartolome <[email protected]>

<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers_v5/transformers_model_timeline.png" alt="Transformers standardizing model definitions">

[*https://huggingface.co/spaces/yonigozlan/Transformers-Timeline*](https://huggingface.co/spaces/yonigozlan/Transformers-Timeline)
@merveenoyan (Contributor) commented Nov 26, 2025

why not embed the Space?

@julien-c (Member) commented

need turkey branding/logo for this?

Comment on lines +78 to +79
we're also working with partners in the JAX ecosystem to ensure our models are compatible with it.
Member left a comment

ah, very nice


We took it as an opportunity to clean up the toolkit and isolate what mattered; we now have a clean slate on top of which to build. Thanks to the many changes from the community and team, improvements in performance, usability, and readability will be simpler to ship.

Now that v5.0.0's first RC is out there, we'll be eagerly awaiting your [feedback](https://github.com/huggingface/transformers/issues/40822).
Member left a comment

Perhaps we want to slightly update the description in that issue after we release this post, to focus on feedback rather than the process you described before.

@LysandreJik LysandreJik marked this pull request as ready for review November 27, 2025 16:35
@ebezzam (Contributor) left a comment

Small typos in the quotes, and rewording in some parts. Awesome 🚀


The overarching theme of this version 5 release is “interoperability”. All refactors, performance improvements, and standardization are aligned with this theme. v5 plays nicely, end-to-end, with the growing ecosystem: train a model with Unsloth/Axolotl/LlamaFactory/MaxText, deploy it with vLLM/SGLang, and export it to llama.cpp/executorch/MLX to run locally!

Version 5 is undeniably an accomplishment of the past five years by a very large number of people in our community. We also see it as a promise, and as a beacon for the direction we want to take.
Contributor left a comment

🎉

10 participants