
Conversation

@PranavBhatP PranavBhatP commented Oct 12, 2025

Reference Issues/PRs

fixes #1979

What does this implement/fix? Explain your changes.

Implements feature scaling for continuous targets and features in the D2 layer via the scalers and target_normalizer parameters.
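The description above can be sketched roughly as follows. This is an illustrative sketch only: the function name fit_and_transform and the dict-based signature are hypothetical and not the actual data_module.py API; it just shows the idea of per-feature sklearn scalers plus a separate target normalizer, as the PR description suggests.

```python
# Hypothetical sketch of the scaling scheme described in the PR:
# one sklearn scaler per continuous feature, plus a target normalizer.
# Names and signature are illustrative, not the real D2 API.
import numpy as np
from sklearn.preprocessing import StandardScaler


def fit_and_transform(features: dict, target: np.ndarray):
    # fit one scaler per continuous feature column
    scalers = {
        name: StandardScaler().fit(col.reshape(-1, 1))
        for name, col in features.items()
    }
    # fit a separate normalizer for the target
    target_normalizer = StandardScaler().fit(target.reshape(-1, 1))

    scaled = {
        name: scalers[name].transform(col.reshape(-1, 1)).ravel()
        for name, col in features.items()
    }
    target_scaled = target_normalizer.transform(target.reshape(-1, 1)).ravel()
    return scaled, target_scaled, scalers, target_normalizer
```

Returning the fitted scalers alongside the transformed arrays lets the caller invert the scaling later (e.g. to report predictions on the original scale).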

What should a reviewer concentrate their feedback on?

PR checklist

  • The PR title starts with either [ENH], [MNT], [DOC], or [BUG]. [BUG] - bugfix, [MNT] - CI, test framework, [ENH] - adding or improving code, [DOC] - writing or improving documentation or docstrings.
  • Added/modified tests
  • Used pre-commit hooks when committing to ensure that code is compliant with hooks. Install hooks with pre-commit install.
    To run hooks independently of a commit, execute pre-commit run --all-files


codecov bot commented Oct 16, 2025

Codecov Report

❌ Patch coverage is 50.64935% with 38 lines in your changes missing coverage. Please review.
⚠️ Please upload report for BASE (main@d4596de). Learn more about missing BASE report.

Files with missing lines                    Patch %   Lines
pytorch_forecasting/data/data_module.py     50.64%    38 Missing ⚠️
Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1983   +/-   ##
=======================================
  Coverage        ?   86.67%           
=======================================
  Files           ?      160           
  Lines           ?     9540           
  Branches        ?        0           
=======================================
  Hits            ?     8269           
  Misses          ?     1271           
  Partials        ?        0           
Flag     Coverage Δ
cpu      86.67% <50.64%> (?)
pytest   86.67% <50.64%> (?)

Flags with carried forward coverage won't be shown.

@PranavBhatP PranavBhatP marked this pull request as ready for review October 17, 2025 11:08
@PranavBhatP

Tests are failing on the notebook. It seems to be an issue caused by the training script in the notebook. Will try changing accelerator = "cpu"; that might resolve the issue.

@review-notebook-app

Check out this pull request on  ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.



requires_grad = feature_data.requires_grad
device = feature_data.device
feature_data_np = (
    feature_data.cpu().detach().numpy().reshape(-1, 1)
)
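For context, a hedged sketch of the full round trip this snippet appears to be part of: move the tensor to CPU, detach, reshape to (n, 1) for sklearn, scale, then restore the original device, dtype, and shape. The function name scale_feature and the choice of StandardScaler are illustrative, not the actual implementation.

```python
# Illustrative round trip around the snippet above (not the real code):
# tensor -> numpy on CPU -> sklearn scaler -> tensor on the original device.
import torch
from sklearn.preprocessing import StandardScaler


def scale_feature(feature_data: torch.Tensor, scaler: StandardScaler) -> torch.Tensor:
    requires_grad = feature_data.requires_grad
    device = feature_data.device
    # sklearn expects a 2-D (n_samples, n_features) array
    feature_np = feature_data.cpu().detach().numpy().reshape(-1, 1)
    scaled_np = scaler.transform(feature_np).reshape(feature_data.shape)
    out = torch.as_tensor(scaled_np, dtype=feature_data.dtype, device=device)
    # reapply the saved flag; note this starts a fresh graph rather than
    # reconnecting to the original one, which is the concern raised in review
    return out.requires_grad_(requires_grad)
```

Saving and reapplying requires_grad preserves the flag on the result, but not the original computation graph: the numpy round trip is inherently non-differentiable.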
Member

I have a doubt: wouldn't using detach again detach the tensor from the computation graph? Wouldn't that lead to the same issue again?

Contributor Author

As far as my knowledge of PyTorch goes, it's good practice to call .detach() before converting a PyTorch tensor to a NumPy array. In any case, the NumPy array will not track gradients, so this won't matter.
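The exchange above can be checked directly. A minimal demonstration in plain PyTorch (nothing here is from this PR): calling .numpy() on a tensor that requires grad raises a RuntimeError, calling .detach() first avoids it, and the original tensor remains in the computation graph.

```python
# Plain-PyTorch check of the point under discussion.
import torch

t = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

try:
    t.numpy()  # RuntimeError: can't call numpy() on a tensor that requires grad
    raised = False
except RuntimeError:
    raised = True

# detach() returns a grad-free view; the conversion now succeeds
arr = t.detach().cpu().numpy()

# the original tensor is still part of the graph and can backprop
loss = (t * 2.0).sum()
loss.backward()
```

So detaching the copy used for NumPy conversion does not sever the original tensor from autograd; only operations performed on the detached copy are untracked.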

@phoeenniixx

I think we could look at the v1 implementation? Maybe that could help us with this issue: how are the sklearn scalers handled there?


PranavBhatP commented Oct 21, 2025

Yes, we will need a design doc to analyse the v1 implementation. That is the best way forward.


fkiraly commented Oct 22, 2025

agree - it seems like one more rabbit hole...

Is there perhaps something we can learn from pytorch-tabular?



Development

Successfully merging this pull request may close these issues.

[ENH] Add Support for Feature Scaling in D2 DataModule Layer.

3 participants