
Normalization layers #70

Closed
Carlyx opened this issue May 8, 2020 · 2 comments

Comments


Carlyx commented May 8, 2020

Hello, I am currently working on the reflection removal problem, which also involves normalization layers. I found that inserting a BN layer between the conv and the ReLU noticeably degrades performance (with batch_size=1), so for now I am not using any normalization layer at all.

I saw that in a previous issue you mentioned that, for image inpainting tasks, normalization is generally believed to hurt the results. How should this phenomenon be explained? Is the main cause the batch size being too small? In your experience, would adding a GN layer work better than using no normalization layer at all?
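Since the question hinges on why BN between the conv and the ReLU breaks down at batch_size=1, a minimal sketch of the usual explanation may help. It is written in PyTorch purely as an assumed illustration (the thread shows none of the repository's actual code): in training mode BN normalizes each channel with statistics estimated from the current batch, so with a single sample those estimates are noisy and drift away from the running averages used at inference.

```python
# Hypothetical PyTorch sketch (not this repository's code): the train/eval
# statistics mismatch that makes BatchNorm fragile at batch_size=1.
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(num_features=3)
x = torch.randn(1, 3, 8, 8)  # batch_size=1, as in the question

bn.train()
y_train = bn(x)  # normalized with this single sample's own mean/var
bn.eval()
y_eval = bn(x)   # normalized with the accumulated running mean/var

# With one sample the two normalizations differ noticeably, so the network
# sees different feature statistics during training and inference.
print((y_train - y_eval).abs().mean())
```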
MaybeShewill-CV (Owner) commented
@Carlyx When the batch size is very small, using normalization significantly hurts the model's results. Replacing BN with GN brought no noticeable improvement :)
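For contrast, GroupNorm computes its statistics over channel groups within each individual sample, so it is independent of batch size; that is why it is the usual candidate replacement for BN at tiny batch sizes, even though, as noted above, it brought no improvement in this case. A hedged sketch of the conv -> norm -> ReLU block with the normalization made switchable (again PyTorch, with a hypothetical conv_block helper; none of this is the repository's code):

```python
# Hypothetical sketch: the conv -> (optional norm) -> ReLU block under
# discussion, with "none" / "bn" / "gn" variants for comparison.
import torch
import torch.nn as nn

def conv_block(in_ch: int, out_ch: int, norm: str = "none") -> nn.Sequential:
    layers = [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1,
                        bias=(norm == "none"))]  # norm layers make the bias redundant
    if norm == "bn":
        # Statistics over the batch dimension: degenerate when batch_size=1.
        layers.append(nn.BatchNorm2d(out_ch))
    elif norm == "gn":
        # Statistics over channel groups within each sample: batch-size independent.
        layers.append(nn.GroupNorm(num_groups=8, num_channels=out_ch))
    layers.append(nn.ReLU(inplace=True))
    return nn.Sequential(*layers)

x = torch.randn(1, 32, 64, 64)  # batch_size=1
for norm in ("none", "bn", "gn"):
    print(norm, tuple(conv_block(32, 32, norm)(x).shape))
```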

Carlyx (Author) commented May 8, 2020

OK, thanks for the reply!

Carlyx closed this as completed May 8, 2020