
Conversation

DingXiaoH

Sometimes, when the tensor memory format changes after this conv (e.g., NCHW -> NHWC for layer normalization), calling backward() raises an "input must be contiguous" error. Making the incoming gradient contiguous in advance solves it.

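The patched code isn't reproduced in this thread, so here is a minimal sketch of the idea in PyTorch. The module and hook placement are illustrative, not the actual diff: the gradient flowing back through the NHWC permute is generally non-contiguous, and a `register_hook` call can make it contiguous before it reaches the conv's backward kernel. Stock `nn.Conv2d` tolerates non-contiguous gradients; the hook matters when the conv is backed by a custom CUDA kernel that requires contiguous inputs.

```python
import torch
import torch.nn as nn

class ConvThenLayerNorm(nn.Module):
    """Sketch: a conv whose output is permuted NCHW -> NHWC for LayerNorm.

    The backward of the permute hands the conv a non-contiguous gradient,
    which a contiguity-requiring conv kernel would reject with
    "input must be contiguous".
    """

    def __init__(self, dim):
        super().__init__()
        self.conv = nn.Conv2d(dim, dim, kernel_size=3, padding=1, groups=dim)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):
        y = self.conv(x)
        if y.requires_grad:
            # Make the grad contiguous before it enters the conv's backward.
            y.register_hook(lambda g: g.contiguous())
        # NCHW -> NHWC for LayerNorm over channels, then back to NCHW.
        y = self.norm(y.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
        return y

x = torch.randn(2, 8, 16, 16, requires_grad=True)
ConvThenLayerNorm(8)(x).sum().backward()
```

An equivalent fix, when the conv is a custom `torch.autograd.Function`, is to call `grad_output = grad_output.contiguous()` at the top of its `backward()` before invoking the kernel.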

CLAassistant commented Apr 29, 2022

CLA assistant check: all committers have signed the CLA.

