
🌐 [i18n-KO] Translated main_classes/peft.md #39515


Draft · wants to merge 6 commits into base: main

Conversation

@luckyvickyricky (Contributor) commented Jul 19, 2025

What does this PR do?

Translated the main_classes/peft.md file of the documentation to Korean.
Thank you in advance for your review.

Part of #20179

Before reviewing

  • Check for missing / redundant translations
  • Grammar check
  • Review or add new terms to the glossary
  • Check inline TOC (e.g. [[lowercased-header]])
  • Check the live preview for gotchas

Who can review? (Initial)

Could you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd, @harheem

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review? (Final)

Added a table of contents (TOC) to the documentation, specifically for the `transformers.integrations.PeftAdapterMixin` section, following the structure and content outlined in [this link](https://huggingface.co/docs/transformers/main/en/main_classes/peft#transformers.integrations.PeftAdapterMixin).
Changed '관리하기 위한' to '관리할 수 있도록' for a more natural Korean expression when describing the purpose of providing functions.
Changed '~할 수 없기 때문에' to '~할 수 없어' for a more concise expression while maintaining clarity.
Changed '주입할 수 없어' to '적용할 수 없어' for better readability.
Considered alternatives:

'삽입': Too literal translation of 'inject'
'입력': Could be misunderstood as data input
'통합': Implies merging two systems
'추가': Simple but less precise

'적용' was chosen as it's the most natural and widely used term in Korean technical documentation for this context.
@4N3MONE (Contributor) commented Jul 19, 2025

LGTM! Great work on the translation 👍

@Rocketknight1 (Member)

cc @stevhliu

@harheem (Contributor) left a comment


Please double-check _toctree.yml.
I've also suggested a revision to the class description; it would be good to think it over together. 🤗

@@ -278,6 +278,8 @@
title: (번역중) Optimization
- local: in_translation
title: 모델 출력
- local: main_classes/PEFT

Suggested change
- local: main_classes/PEFT
- local: main_classes/peft



I believe lowercase `peft` is correct~



It looks like the peft document should go under the model output entry; please double-check where this document currently sits in the toctree!


# PEFT[[transformers.integrations.PeftAdapterMixin]]

[`~integrations.PeftAdapterMixin`]은 Transformers와 함께 어댑터를 관리할 수 있도록 [PEFT](https://huggingface.co/docs/peft/index) 라이브러리의 함수들을 제공합니다. 이 믹스인은 현재 LoRA, IA3, AdaLora를 지원합니다. 프리픽스 튜닝 방법들(프롬프트 튜닝, 프롬프트 학습)은 torch 모듈에 삽입할 수 없어 지원되지 않습니다.

Suggested change
[`~integrations.PeftAdapterMixin`]은 Transformers와 함께 어댑터를 관리할 수 있도록 [PEFT](https://huggingface.co/docs/peft/index) 라이브러리의 함수들을 제공합니다. 이 믹스인은 현재 LoRA, IA3, AdaLora를 지원합니다. 프리픽스 튜닝 방법들(프롬프트 튜닝, 프롬프트 학습)은 torch 모듈에 삽입할 수 없어 지원되지 않습니다.
[`~integrations.PeftAdapterMixin`]은 [PEFT](https://huggingface.co/docs/peft/index) 라이브러리의 함수를 제공하여 트랜스포머 모델의 어댑터를 관리합니다. 이 믹스인은 현재 LoRA, IA3, AdaLora를 지원합니다. 프리픽스 튜닝 방법들(프롬프트 튜닝, 프롬프트 학습)은 torch 모듈에 삽입할 수 없는 구조이므로 지원되지 않습니다.


Wow, this part was really hard to translate! The original says 'with Transformers', so it points at the Transformers library, but reading the class description in the original document closely, it is about managing adapters for transformer models, which is why I translated it this way. 🤔 Worth thinking over together.

https://huggingface.co/docs/transformers/main_classes/peft#transformers.integrations.PeftAdapterMixin

@stevhliu (Member) left a comment


Thanks, we can merge once the feedback has been addressed!
