Hello, I'm currently working on reflection removal, which also involves normalization layers. I found that inserting a BN layer between the conv and ReLU noticeably degrades performance (batch_size=1), so for now I'm not using any normalization layers at all.
I saw you mention in a previous issue that normalization is generally believed to hurt results in image restoration tasks. How should this phenomenon be explained? Is the main cause the batch size being too small? In your experience, would adding a GN layer work better than using no normalization layer at all? @MaybeShewill-CV
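For context, here is a minimal PyTorch sketch (not from this repo) of the conv → norm → ReLU block under discussion, with the normalization switchable between none, BN, and GN. The `conv_block` helper and its parameters (e.g. `num_groups=8`) are illustrative assumptions, not the author's code.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, norm=None):
    """Conv -> (optional norm) -> ReLU block.

    norm: None (no normalization), 'bn' (BatchNorm2d), or 'gn' (GroupNorm).
    """
    layers = [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)]
    if norm == 'bn':
        # With batch_size=1, BN estimates batch statistics from a single
        # sample, which is noisy and tends to hurt restoration-style tasks.
        layers.append(nn.BatchNorm2d(out_ch))
    elif norm == 'gn':
        # GroupNorm normalizes over channel groups, independent of batch size.
        layers.append(nn.GroupNorm(num_groups=8, num_channels=out_ch))
    layers.append(nn.ReLU(inplace=True))
    return nn.Sequential(*layers)

# Example: batch_size=1 input, as in the question above.
x = torch.randn(1, 3, 64, 64)
for norm in (None, 'bn', 'gn'):
    y = conv_block(3, 16, norm=norm)(x)
    print(norm, y.shape)
```

The reason GN is the usual candidate here is that its statistics are computed per sample over channel groups, so they don't degrade as the batch dimension shrinks the way BN's do.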
@Carlyx With a very small batch size, using norm significantly hurts the model's results. Replacing BN with GN brought no real improvement :)
Got it, thanks for the reply~