
DisentangledSelfAttention implementation inconsistent with paper #3

Open
zhujiem opened this issue Feb 18, 2022 · 0 comments

Comments


zhujiem commented Feb 18, 2022

Hi, we plan to integrate your model into FuxiCTR, a library for CTR prediction.

However, we found that the DisentangledSelfAttention layer used in DESTINE.py does not seem consistent with the paper's description of the unary attention weights.

  • The paper computes the unary attention weights by multiplying the mean query with each key.
  • The code instead projects the key itself down to num_heads dimensions and applies a softmax to obtain the unary attention weights (see the sketch below). See code.
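For concreteness, here is a minimal PyTorch sketch of the two readings. The tensor shapes, variable names, and the `W_unary` projection are assumptions for illustration only, not code taken from DESTINE.py:

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes, not copied from the repository.
batch_size, num_fields, embed_dim, num_heads = 32, 10, 16, 2
query = torch.randn(batch_size, num_fields, embed_dim)
key = torch.randn(batch_size, num_fields, embed_dim)

# (a) Paper description: the averaged query scores every key,
#     and the scores are softmaxed over the field axis.
mu_q = query.mean(dim=1, keepdim=True)                    # (B, 1, D)
unary_paper = F.softmax((mu_q * key).sum(dim=-1), dim=1)  # (B, F)

# (b) Code, as read in this issue: the key alone is projected from
#     embed_dim to num_heads and softmaxed over fields; the query is
#     not involved in the unary term at all.
W_unary = torch.nn.Linear(embed_dim, num_heads, bias=False)  # hypothetical layer
unary_code = F.softmax(W_unary(key), dim=1)                  # (B, F, H)
```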

Is one of these versions wrong? Could you suggest which version we should implement in FuxiCTR, the one in your code or the one in the paper? Thanks!
