Operator combination of model structures #3056

zhihaofan opened this issue Jun 14, 2024 · 1 comment
zhihaofan commented Jun 14, 2024

Hello,
I am using AIMET for QAT, but when I call fold_all_batch_norms I see a large difference in the model's outputs before and after folding.
When I tried the same thing with ./Examples/torch/quantization/qat.ipynb, fold_all_batch_norms also produced differences, but there the results were acceptable.

Both models are built from conv, BatchNorm, and ReLU layers with residual connections. Is there a guide on how these operators should be paired for folding? I suspect a mis-pairing somewhere is causing the problem.
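(For reference, the before/after comparison described above can be reproduced with a sketch like the following. It assumes the aimet_torch 1.x API, where fold_all_batch_norms takes the model plus an input shape; the resnet18 stand-in and the input shape are illustrative, not the reporter's actual model.)

```python
import copy
import torch
from torchvision.models import resnet18
from aimet_torch.batch_norm_fold import fold_all_batch_norms

# Illustrative stand-in for a conv/BatchNorm/ReLU residual model
model = resnet18().eval()
reference = copy.deepcopy(model)

# Fold every BatchNorm into its preceding conv layer
fold_all_batch_norms(model, input_shapes=(1, 3, 224, 224))

# In eval mode, BN folding is mathematically equivalent, so any
# difference beyond float rounding error points at a mis-paired fold
dummy_input = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    diff = (reference(dummy_input) - model(dummy_input)).abs().max()
print(f"max abs difference after folding: {diff.item():.6e}")
```

Note that the equivalence only holds with the model in eval mode; in train mode, BatchNorm uses batch statistics and the outputs will differ even when folding is correct.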

quic-bhushans (Contributor) commented
@zhihaofan could you please share an example that reproduces this issue?
Also, it would be great if you could try our latest aimet-torch: https://pypi.org/project/aimet-torch/
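(For anyone wanting to put together such a repro, a minimal skeleton of the conv + BatchNorm + ReLU residual structure described in this issue could look like the block below; it is a sketch of that structure, not the reporter's actual model.)

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Conv + BatchNorm + ReLU with a residual connection,
    matching the structure described in this issue."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Residual add before the final ReLU
        return self.relu(out + x)
```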

quic-bhushans self-assigned this Apr 30, 2025