[Not for merge] Support Zipformer encoder for LLM based ASR #1944

Draft · wants to merge 24 commits into base: master

Conversation

@yfyeung (Collaborator) commented May 15, 2025

This PR adds the zipformer_llm_zh recipe.

Some new features and modifications:

  • Introduce a batch-shaving mechanism that dynamically reduces the batch size when an OOM occurs, preventing training interruptions (a sketch follows this list).
  • Transition to full DeepSpeed training, removing torch.autocast (supported by "Fix scaling.py: ensure SwooshL/SwooshR output dtype matches input dtype" #1940; see the DeepSpeed sketch below).
  • Expose more parameters of DynamicBucketSampler for more efficient batching (see the sampler sketch below).
  • Fix data preparation from Hugging Face.
  • Set world_size and rank explicitly for the dataloader (also shown in the sampler sketch).
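
Below is a minimal sketch of the batch-shaving idea, assuming a batch dict of per-utterance tensors; the function name and batch layout are hypothetical illustrations, not the recipe's actual code:

```python
# Hypothetical sketch of batch shaving: on CUDA OOM, free cached memory,
# halve the batch, and retry. Names and batch layout are illustrative.
import torch

def compute_loss_with_shaving(compute_loss, batch, min_size: int = 1):
    while True:
        try:
            return compute_loss(batch)
        except torch.cuda.OutOfMemoryError:
            torch.cuda.empty_cache()
            cur_size = batch["inputs"].size(0)  # assumes batch-first tensors
            if cur_size // 2 < min_size:
                raise  # cannot shave further; surface the OOM
            # Keep the first half of each per-utterance entry and retry.
            batch = {k: v[: cur_size // 2] for k, v in batch.items()}
```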
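
For the DeepSpeed transition, a minimal initialization sketch; the model variable and the config path are placeholders, not the recipe's actual setup:

```python
# Hypothetical DeepSpeed setup; `model` and the config path are placeholders.
import deepspeed

model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config="ds_config.json",  # placeholder path to the DeepSpeed JSON config
)
```

With DeepSpeed controlling mixed precision through its config, an explicit torch.autocast context in the training loop is no longer needed.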
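
And a sketch of constructing the sampler with explicit world_size and rank, assuming lhotse's DynamicBucketingSampler API; all parameter values are illustrative:

```python
# Illustrative only: explicit world_size/rank instead of auto-detection;
# the duration/bucket values are placeholders, not the recipe's settings.
from lhotse import CutSet
from lhotse.dataset.sampling import DynamicBucketingSampler

def make_sampler(cuts: CutSet, world_size: int, rank: int):
    return DynamicBucketingSampler(
        cuts,
        max_duration=600.0,     # batching budget in seconds (placeholder)
        num_buckets=30,         # placeholder
        shuffle=True,
        drop_last=True,
        world_size=world_size,  # set explicitly for the dataloader
        rank=rank,
    )
```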

@yfyeung yfyeung changed the title [WIP] Add Streaming Zipformer LLM recipe for ASR [WIP] Add streaming Zipformer encoder for LLM based ASR May 15, 2025
@yfyeung yfyeung changed the title [WIP] Add streaming Zipformer encoder for LLM based ASR [WIP] Support streaming Zipformer encoder for LLM based ASR May 15, 2025
@yfyeung yfyeung changed the title [WIP] Support streaming Zipformer encoder for LLM based ASR [WIP] Support Zipformer encoder for LLM based ASR May 15, 2025
@yfyeung yfyeung changed the title [WIP] Support Zipformer encoder for LLM based ASR [Not for merge] Support Zipformer encoder for LLM based ASR Jun 2, 2025