
Make KNN feature normalization optional #1457

Merged: 3 commits merged into master from add-knn-normalize-argument on Dec 18, 2023

Conversation

@guarin (Contributor) commented Dec 15, 2023

Changes

  • Make KNN feature normalization optional (see the sketch after this list)
  • Always reset KNN features at beginning of training epoch
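
As context, below is a minimal sketch of what an optional normalization flag can look like in a weighted KNN prediction step. This is not the diff of this PR; the exact signature, the placement of the `normalize` parameter, and the default values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def knn_predict(
    feature: torch.Tensor,         # (B, D) query features
    feature_bank: torch.Tensor,    # (D, N) features from the training set
    feature_labels: torch.Tensor,  # (N,) integer class labels of the bank
    num_classes: int,
    knn_k: int = 200,
    knn_t: float = 0.1,
    normalize: bool = True,        # optional flag sketched here (assumption)
) -> torch.Tensor:
    # Optionally L2-normalize so the dot product below is cosine similarity.
    if normalize:
        feature = F.normalize(feature, dim=1)
        feature_bank = F.normalize(feature_bank, dim=0)
    # Similarity between each query and every bank feature: (B, N).
    sim_matrix = torch.mm(feature, feature_bank)
    sim_weight, sim_indices = sim_matrix.topk(k=knn_k, dim=-1)
    sim_labels = torch.gather(
        feature_labels.expand(feature.size(0), -1), dim=-1, index=sim_indices
    )
    # Temperature-scaled weights for the k nearest neighbours.
    sim_weight = (sim_weight / knn_t).exp()
    # Weighted vote over the neighbours' one-hot labels.
    one_hot = torch.zeros(
        feature.size(0) * knn_k, num_classes, device=sim_labels.device
    )
    one_hot = one_hot.scatter(dim=-1, index=sim_labels.view(-1, 1), value=1.0)
    pred_scores = torch.sum(
        one_hot.view(feature.size(0), -1, num_classes) * sim_weight.unsqueeze(dim=-1),
        dim=1,
    )
    # Class indices ranked by score, highest first.
    return pred_scores.argsort(dim=-1, descending=True)
```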

How was it tested?

  • Added unit tests

Closes: #1425

codecov bot commented Dec 15, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison: base (610f73e) at 85.34% vs. head (c0d09ee) at 85.37%.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1457      +/-   ##
==========================================
+ Coverage   85.34%   85.37%   +0.02%     
==========================================
  Files         133      133              
  Lines        5563     5572       +9     
==========================================
+ Hits         4748     4757       +9     
  Misses        815      815              


philippmwirth merged commit dc612ea into master on Dec 18, 2023
10 checks passed
philippmwirth deleted the add-knn-normalize-argument branch on December 18, 2023 at 08:06
guarin added a commit that referenced this pull request Jan 19, 2024
* Make KNN feature normalization optional
* Reset features after training epoch
* Always convert features to correct dtype
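
The other two points from that commit, resetting the stored features each training epoch and converting them to a fixed dtype, could be sketched along the following lines. The `FeatureBank` helper, its method names, and the `feature_dtype` default are illustrative assumptions, not the library's actual implementation.

```python
from typing import List, Tuple
import torch

class FeatureBank:
    """Illustrative helper: accumulate backbone features for KNN evaluation."""

    def __init__(self, feature_dtype: torch.dtype = torch.float32):
        self.feature_dtype = feature_dtype
        self._features: List[torch.Tensor] = []
        self._targets: List[torch.Tensor] = []

    def reset(self) -> None:
        # Called at the beginning of every training epoch so the bank never
        # mixes features from different epochs or from a previous fit.
        self._features = []
        self._targets = []

    def update(self, features: torch.Tensor, targets: torch.Tensor) -> None:
        # Always convert to the configured dtype before storing, e.g.
        # torch.float16 to keep a large bank in memory.
        self._features.append(features.detach().to(self.feature_dtype).cpu())
        self._targets.append(targets.detach().cpu())

    def get(self) -> Tuple[torch.Tensor, torch.Tensor]:
        # Returns the (D, N) bank expected by the knn_predict sketch above,
        # together with the (N,) label tensor.
        features = torch.cat(self._features, dim=0).t().contiguous()
        targets = torch.cat(self._targets, dim=0)
        return features, targets
```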
Successfully merging this pull request may close these issues:

  • Clarification: Why does KNN eval normalize the representations? (#1425)