…ant to TransformerNUFFTPyNUFFT

Replaces the pynufft-backed `TransformerNUFFT` with a JAX-native nufftax implementation that matches `TransformerDFT` to ~1e-13 relative error across odd/even/non-square sizes (vs pynufft's ~6% gridding error on production 256×256 cases) and supports `jit`/`grad`/`vmap`. The original pynufft class is preserved as `TransformerNUFFTPyNUFFT` for backwards compatibility. `apply_sparse_operator()` raises `NotImplementedError` when called with the new `TransformerNUFFT`; the sparse-operator path depends on pynufft's kernel-deconvolved adjoint scale, so users must opt back to `TransformerDFT` or `TransformerNUFFTPyNUFFT` for that path.

Co-Authored-By: Claude Opus 4.7 (1M context) <[email protected]>
Summary
- Replace `TransformerNUFFT` with a JAX-native nufftax-based implementation as the default. Preserve the original pynufft class as `TransformerNUFFTPyNUFFT` for backwards compatibility.
- Matches `TransformerDFT` to ~1e-13 relative error across 13 stress-test configurations (odd/even/non-square sizes, varying pixel scales, sparse/dense uv, low/high frequencies). Pynufft sits at ~6% relative gridding error on production 256×256 cases. A parity sketch follows this list.
- Supports `jax.jit`, `jax.grad` and `jax.vmap`, which unblocks end-to-end JAX-jitted interferometer likelihoods. The JIT'd forward NUFFT measures ~20% faster than pynufft (6.1 ms vs 7.7 ms on 256² / 1000 vis).
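For illustration, a parity check along these lines should reproduce the claim. This is a hedged sketch: the `Mask2D.circular`, `Array2D` and `visibilities_from` names follow common PyAutoArray usage but are assumptions, not verified against this branch.

```python
# Hedged sketch of the DFT-vs-NUFFT parity check; constructor and
# method names are assumptions based on this PR's text.
import numpy as np
import autoarray as aa

rng = np.random.default_rng(0)
real_space_mask = aa.Mask2D.circular(
    shape_native=(256, 256), pixel_scales=0.05, radius=4.0
)
uv_wavelengths = rng.uniform(-1.0e4, 1.0e4, size=(1000, 2))

dft = aa.TransformerDFT(uv_wavelengths=uv_wavelengths, real_space_mask=real_space_mask)
nufft = aa.TransformerNUFFT(uv_wavelengths=uv_wavelengths, real_space_mask=real_space_mask)

# A random masked image; Array2D construction details are assumed.
image = aa.Array2D(values=rng.normal(size=real_space_mask.shape_native), mask=real_space_mask)

vis_dft = np.asarray(dft.visibilities_from(image=image))
vis_nufft = np.asarray(nufft.visibilities_from(image=image))

rel = np.abs(vis_nufft - vis_dft).max() / np.abs(vis_dft).max()
print(f"max relative error: {rel:.1e}")  # the PR reports ~1e-13
```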
API Changes

- `aa.TransformerNUFFT` is now the JAX-native nufftax variant (default).
- `aa.TransformerNUFFTPyNUFFT` exposes the legacy pynufft variant unchanged.
- `Interferometer.apply_sparse_operator()` raises `NotImplementedError` when the dataset's transformer is the new `TransformerNUFFT`. The sparse-operator path depends on pynufft's kernel-deconvolved adjoint scale; users can opt back via `transformer_class=aa.TransformerNUFFTPyNUFFT`, or use `TransformerDFT` (which is what the JAX-likelihood scripts already do). A sketch of the opt-out follows this list.
- `pyproject.toml`: `nufftax` added to the `optional` and `dev` extras alongside `pynufft`.
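A hedged sketch of the opt-out. The `transformer_class=` keyword is quoted from this PR; the `from_fits` argument names follow common PyAutoArray usage but are assumptions.

```python
# Hedged sketch: keeping the sparse-operator path by opting back to
# the legacy pynufft transformer. from_fits keyword names are assumed.
import autoarray as aa

real_space_mask = aa.Mask2D.circular(shape_native=(256, 256), pixel_scales=0.05, radius=4.0)

dataset = aa.Interferometer.from_fits(
    data_path="data.fits",
    noise_map_path="noise_map.fits",
    uv_wavelengths_path="uv_wavelengths.fits",
    real_space_mask=real_space_mask,
    transformer_class=aa.TransformerNUFFTPyNUFFT,  # legacy kernel-deconvolved adjoint
)

# With the new default TransformerNUFFT this call would raise NotImplementedError.
dataset.apply_sparse_operator()
```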
Behavior change

`TransformerNUFFT.image_from` now returns the strict mathematical adjoint of the forward transform (matching `TransformerDFT.image_from` exactly). The legacy pynufft class included internal Kaiser-Bessel kernel deconvolution and IFFT normalization, so the absolute scale and sign of dirty images differ from before. Use `TransformerNUFFTPyNUFFT` if you need the legacy adjoint behaviour.
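"Strict mathematical adjoint" has a concrete, testable meaning: for the forward operator F, ⟨F x, y⟩ = ⟨x, Fᴴ y⟩. A self-contained pure-JAX illustration with an explicit non-uniform DFT matrix (all names here are illustrative; this does not call autoarray or nufftax):

```python
# Dot-product test of the adjoint property that image_from now satisfies:
# <F x, y> == <x, F^H y> for the forward non-uniform DFT matrix F.
import jax
import jax.numpy as jnp

k1, k2, k3, k4, k5 = jax.random.split(jax.random.PRNGKey(0), 5)

n_pix, n_vis = 64, 100
positions = jax.random.uniform(k1, (n_pix, 2))                    # pixel positions
uv = jax.random.uniform(k2, (n_vis, 2), minval=-5.0, maxval=5.0)  # uv coordinates

# Forward operator: vis = F @ image
F = jnp.exp(-2j * jnp.pi * (uv @ positions.T))

image = jax.random.normal(k3, (n_pix,))
vis = jax.random.normal(k4, (n_vis,)) + 1j * jax.random.normal(k5, (n_vis,))

lhs = jnp.vdot(F @ image, vis)           # <F x, y>
rhs = jnp.vdot(image, F.conj().T @ vis)  # <x, F^H y>
print(jnp.abs(lhs - rhs))                # ~0: conj-transpose is the strict adjoint
```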
Companion PRs

- `TransformerNUFFTPyNUFFT` (forthcoming PR on `feature/nufftax-default-transformer`).
- `TransformerNUFFTPyNUFFT` (forthcoming PR on `feature/nufftax-default-transformer`).

These should merge after this PR is merged.
Scripts Changed
None in this repo. The workspace parity script `autolens_workspace_test/scripts/interferometer/nufft.py` already exists and now reports 0.0 residual between the new class and its inline nufftax helper.

Test plan
- `pytest test_autoarray/`: 750 / 750 passed
- `apply_sparse_operator`: new class raises, DFT and legacy pass through unchanged (see the sketch after this list)
- `aa.X`; `TransformerUnion` extended
- `black --check` clean

🤖 Generated with Claude Code
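For reference, a hedged pytest sketch of the `apply_sparse_operator` checks above. The dataset construction and keyword names are assumptions, not the repo's actual fixtures.

```python
# Hedged sketch of the raises/pass-through checks; Interferometer
# construction details are assumed, not taken from the test suite.
import numpy as np
import pytest
import autoarray as aa


def dataset_with(transformer_class):
    mask = aa.Mask2D.circular(shape_native=(7, 7), pixel_scales=1.0, radius=3.0)
    return aa.Interferometer(
        data=np.ones(9, dtype=complex),
        noise_map=np.ones(9, dtype=complex),
        uv_wavelengths=np.ones((9, 2)),
        real_space_mask=mask,
        transformer_class=transformer_class,
    )


def test_apply_sparse_operator_raises_for_jax_nufft():
    with pytest.raises(NotImplementedError):
        dataset_with(aa.TransformerNUFFT).apply_sparse_operator()


def test_apply_sparse_operator_passes_for_dft_and_legacy():
    dataset_with(aa.TransformerDFT).apply_sparse_operator()
    dataset_with(aa.TransformerNUFFTPyNUFFT).apply_sparse_operator()
```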