
Conversation

@fonnesbeck (Member) commented Nov 22, 2025

Description

A PSD (power spectral density) method and test are added for the RatQuad covariance. A simple HSGP fits successfully with the resulting RatQuad kernel.
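For background, a sketch of why a modified Bessel function appears here (this is the standard Gamma-mixture view of RatQuad, not taken from the PR itself): with $\tau \sim \mathrm{Gamma}(\alpha,\ \mathrm{rate}=\alpha\ell^2)$,

$$k_{\mathrm{RQ}}(r) = \mathbb{E}_\tau\!\left[e^{-\tau r^2/2}\right] = \left(1 + \frac{r^2}{2\alpha\ell^2}\right)^{-\alpha},$$

so the spectral density is the matching Gamma mixture of Gaussian spectra,

$$S(\omega) = \int_0^\infty \sqrt{\frac{2\pi}{\tau}}\, e^{-\omega^2/(2\tau)}\, p(\tau)\, d\tau,$$

an integral of the form $\int_0^\infty \tau^{\nu-1} e^{-a\tau - b/\tau}\, d\tau = 2(b/a)^{\nu/2} K_\nu(2\sqrt{ab})$. That $K_\nu$ is where `pytensor.kv` enters (see the discussion below).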

Related Issue

  • Closes #
  • Related to #

Checklist

Type of change

  • New feature / enhancement
  • Bug fix
  • Documentation
  • Maintenance
  • Other (please specify):

fonnesbeck requested a review from bwengals, November 22, 2025 03:07
@codecov (bot) commented Nov 22, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 91.49%. Comparing base (f69e159) to head (46343e2).

Additional details and impacted files


@@           Coverage Diff           @@
##             main    #7973   +/-   ##
=======================================
  Coverage   91.49%   91.49%           
=======================================
  Files         116      116           
  Lines       18962    18971    +9     
=======================================
+ Hits        17349    17358    +9     
  Misses       1613     1613           
| Files with missing lines | Coverage | Δ |
| --- | --- | --- |
| pymc/gp/cov.py | 98.68% <100.00%> | (+0.02%) ⬆️ |

@fonnesbeck (Member Author)

BTW, one limitation here is that pytensor.kv does not appear to support gradient calculation, so NUTS can't be fully supported. There is an alternative parameterization that may not need Bessel functions, unless said support is coming.

@jessegrabowski (Member)

> BTW, one limitation here is that pytensor.kv does not appear to support gradient calculation, so NUTS can't be fully supported. There is an alternative parameterization that may not need Bessel functions, unless said support is coming.

Does jax or tfp have the gradients we could use as the basis for an implementation?

@fonnesbeck (Member Author) commented Nov 24, 2025

Apparently not.


Looks like a numerical approximation (via finite differences) is as close as you can get.
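To make that concrete, a minimal sketch of the finite-difference fallback (illustrative only; `scipy.special.kv` serves as the reference implementation, and the step size `h` is an arbitrary choice):

```python
# Central finite-difference approximation of dK_v(x)/dv, the derivative of
# the modified Bessel function of the second kind with respect to its order.
# This is the direction that has no simple closed form for general v.
import numpy as np
from scipy.special import kv

def dkv_dv(v, x, h=1e-6):
    """Approximate dK_v(x)/dv by central differences in the order v."""
    return (kv(v + h, x) - kv(v - h, x)) / (2 * h)

print(dkv_dv(1.5, 2.0))  # spot-check against an arbitrary-precision library
```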


@ricardoV94 (Member) commented Nov 24, 2025

In the past I've implemented a couple of gradients using Stan math / Wolfram as references. I've never had a case where TFP was ahead of the curve for these special functions.

@ricardoV94 (Member) commented Nov 24, 2025

The gradient w.r.t. the second input is the trivial one in these functions; we already have it.
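For reference, the identity behind that is the standard Bessel recurrence (DLMF 10.29.2); a quick SciPy spot check:

```python
# Verify K_v'(x) = -(K_{v-1}(x) + K_{v+1}(x)) / 2, the closed-form gradient
# with respect to the argument x (the "trivial" direction).
import numpy as np
from scipy.special import kv, kvp  # kvp(v, x) differentiates in x

v = 1.5
x = np.linspace(0.5, 5.0, 10)
assert np.allclose(kvp(v, x), -0.5 * (kv(v - 1, x) + kv(v + 1, x)))
```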

@fonnesbeck (Member Author)

There has been some useful discussion on the JAX side: jax-ml/jax#12402

@fonnesbeck (Member Author)

FWIW, it does work without a fully differentiable function; it just falls back to a slice sampler for `alpha` (a sketch follows below).

ratquad_hsgp.html
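A minimal sketch of that setup (assuming the PSD method this PR adds to `RatQuad`; the data, priors, and HSGP settings below are illustrative):

```python
# HSGP with a RatQuad kernel: slice-sample alpha (the Bessel-order gradient
# is unavailable), and let PyMC assign NUTS to the remaining free variables.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 100)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.normal(size=100)

with pm.Model():
    alpha = pm.Gamma("alpha", 2, 1)
    ell = pm.Gamma("ell", 2, 1)
    cov = pm.gp.cov.RatQuad(1, alpha=alpha, ls=ell)
    gp = pm.gp.HSGP(m=[30], c=1.5, cov_func=cov)
    f = gp.prior("f", X=X)
    pm.Normal("y_obs", mu=f, sigma=0.1, observed=y)
    idata = pm.sample(step=pm.Slice([alpha]))  # NUTS is assigned to the rest
```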

npt.assert_allclose(true_1d_psd, test_1d_psd, atol=1e-5)

def test_psd_eigenvalues(self):
    # Test PSD implementation using Szegő’s Theorem
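For context on the test idea: Szegő's theorem says the eigenvalues of a large Toeplitz covariance matrix built from a stationary kernel are asymptotically distributed like values of its symbol (a discretized spectral density). A rough, self-contained illustration (grid size and spacing are made up):

```python
# Szegő-style check: eigenvalues of the Toeplitz kernel matrix should
# distribute like the symbol f(theta) = k[0] + 2 * sum_m k[m] * cos(m*theta).
import numpy as np
from scipy.linalg import toeplitz, eigvalsh

n, h = 400, 0.1                     # illustrative grid size and spacing
alpha, ell = 2.0, 1.0
r = h * np.arange(n)
k = (1 + r**2 / (2 * alpha * ell**2)) ** (-alpha)   # RatQuad at lags r

eigs = np.sort(eigvalsh(toeplitz(k)))
theta = np.linspace(-np.pi, np.pi, n, endpoint=False)
symbol = k[0] + 2 * np.cos(np.outer(theta, np.arange(1, n))) @ k[1:]

# For large n, the two empirical distributions should roughly agree:
print(np.quantile(eigs, [0.25, 0.5, 0.75]))
print(np.quantile(symbol, [0.25, 0.5, 0.75]))
```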
Contributor:

Will this work for a 2 or higher dimensional X?


