Round the epilogue offset to positive values in cunn_SoftMaxForward #2210

Closed · xinyazhang wants to merge 1 commit from xinyazhang/fixsoftmax-size_9_1017

Conversation

xinyazhang

This fixes an out-of-bounds (OOB) memory access for the following code:

```python
import torch
qk = torch.randn((9,1017), dtype=torch.float64, device='cuda')
smqk = torch.softmax(qk, dim=-1)
```

Correctness can be confirmed with:

```python
import torch
import numpy as np
from scipy.special import softmax

qk = torch.randn((9,1017), dtype=torch.float64, device='cuda')
nqk = qk.cpu().numpy()
smqk = torch.softmax(qk, dim=-1)
nsmqk = smqk.cpu().numpy()
smnqk = softmax(nqk, axis=-1)

# Compare the GPU result against SciPy's CPU reference.
print(np.allclose(smnqk, nsmqk))
```
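With the fix in place, this prints `True`. On an affected build, the out-of-bounds access can instead surface as a mismatch here or as a device-side fault.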

This is ported from upstream PR pytorch#154634.
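
For context, the title refers to the offset computed for the kernel's scalar epilogue, the tail that remains after the vectorized loop. Below is a minimal sketch of the failure mode with illustrative numbers, not the actual kernel code: a 1017-element float64 row spans 8136 bytes, so consecutive rows alternate 16-byte alignment and take the shifted path, and because C++'s `%` returns a negative remainder for a negative dividend, an offset derived from it can point before the buffer.

```cpp
#include <algorithm>
#include <cstdio>

int main() {
  // Illustrative numbers only: a 1017-element float64 row, a one-element
  // alignment shift, and a 1024-thread block with vector width 2.
  int row_len = 1017;
  int shift = (row_len * 8 % 16) / 8;       // = 1: every other row misaligned
  int remaining = row_len + shift - 1024;   // left after the prologue: -6

  // C++ '%' truncates toward zero, so a negative dividend gives a
  // negative remainder.
  int epilogue_offset = remaining % (2 * 1024);
  printf("epilogue offset before rounding: %d\n", epilogue_offset);  // -6

  // A negative offset would index before the row. Round it to a
  // non-negative value, as the PR title describes:
  epilogue_offset = std::max(epilogue_offset, 0);
  printf("epilogue offset after rounding:  %d\n", epilogue_offset);  // 0
  return 0;
}
```

The exact change is in the upstream PR referenced above; this sketch only shows the sign behavior being rounded away.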

@pruthvistony
Collaborator

@xinyazhang, can this PR be closed?

xinyazhang closed this Jun 6, 2025
@xinyazhang
Author

Duplicate of #2247.

xinyazhang deleted the xinyazhang/fixsoftmax-size_9_1017 branch June 6, 2025 12:31