Commit 6b607f9

Fix typos and rerun (pymc-devs#664)
1 parent 475a1b4 commit 6b607f9

File tree

2 files changed: +232, -160 lines

examples/gaussian_processes/GP-Latent.ipynb: +220, -147 (large diff not rendered)

examples/gaussian_processes/GP-Latent.myst.md: +12, -13

@@ -5,9 +5,9 @@ jupytext:
     format_name: myst
     format_version: 0.13
 kernelspec:
-  display_name: Python 3 (ipykernel)
+  display_name: pymc-examples
   language: python
-  name: python3
+  name: pymc-examples
 myst:
   substitutions:
     extra_dependencies: jax numpyro
@@ -119,14 +119,13 @@ cov_func = eta_true**2 * pm.gp.cov.ExpQuad(1, ell_true)
 mean_func = pm.gp.mean.Zero()
 
 # The latent function values are one sample from a multivariate normal
-# Note that we have to call `eval()` because PyMC built on top of Theano
 f_true = pm.draw(pm.MvNormal.dist(mu=mean_func(X), cov=cov_func(X)), 1, random_seed=rng)
 
 # The observed data is the latent function plus a small amount of T distributed noise
 # The standard deviation of the noise is `sigma`, and the degrees of freedom is `nu`
 sigma_true = 1.0
 nu_true = 5.0
-y = f_true + sigma_true * rng.normal(size=n)
+y = f_true + sigma_true * rng.standard_t(df=nu_true, size=n)
 
 ## Plot the data and the unobserved latent function
 fig = plt.figure(figsize=(10, 4))
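
For context only (not part of this commit): a minimal standalone sketch of how the data-generating cell reads after this hunk. The seed, `n`, `X`, and the true hyperparameter values below are placeholders standing in for the notebook's own settings; the `f_true` and `y` lines follow the diff.

```python
import numpy as np
import pymc as pm

# Placeholder setup; the notebook defines its own rng, n, X, and true hyperparameters
rng = np.random.default_rng(42)
n = 100
X = np.linspace(0, 10, n)[:, None]
ell_true = 1.0
eta_true = 2.0

cov_func = eta_true**2 * pm.gp.cov.ExpQuad(1, ell_true)
mean_func = pm.gp.mean.Zero()

# The latent function values are one draw from a multivariate normal
f_true = pm.draw(pm.MvNormal.dist(mu=mean_func(X), cov=cov_func(X)), 1, random_seed=rng)

# Heavy-tailed observation noise: Student-T draws replace the previous Gaussian draws
sigma_true = 1.0
nu_true = 5.0
y = f_true + sigma_true * rng.standard_t(df=nu_true, size=n)
```
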
@@ -164,7 +163,7 @@ with pm.Model() as model:
     ) # add one because student t is undefined for degrees of freedom less than one
     y_ = pm.StudentT("y", mu=f, lam=1.0 / sigma, nu=nu, observed=y)
 
-    idata = pm.sample(1000, tune=1000, chains=2, cores=2, nuts_sampler="numpyro")
+    idata = pm.sample(nuts_sampler="numpyro")
 ```
 
 ```{code-cell} ipython3
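
Again as context only: a sketch of the model block around the simplified `pm.sample` call, reconstructed from this hunk's context lines. The `ell`, `eta`, and `sigma` priors and the stand-in data are assumptions, not necessarily the notebook's exact choices; the likelihood and sampling lines match the diff. With no other arguments, `pm.sample` falls back to its defaults of 1000 tuning steps and 1000 draws per chain.

```python
import numpy as np
import pymc as pm

# Stand-in data so the sketch runs on its own; the notebook uses X and y from the cell above
rng = np.random.default_rng(0)
n = 100
X = np.linspace(0, 10, n)[:, None]
y = np.sin(X[:, 0]) + 0.5 * rng.standard_t(df=5, size=n)

with pm.Model() as model:
    # Assumed (placeholder) priors for the GP hyperparameters
    ell = pm.Gamma("ell", alpha=2, beta=1)
    eta = pm.HalfNormal("eta", sigma=5)

    cov = eta**2 * pm.gp.cov.ExpQuad(1, ell)
    gp = pm.gp.Latent(cov_func=cov)
    f = gp.prior("f", X=X)

    sigma = pm.HalfNormal("sigma", sigma=2.0)
    nu = 1 + pm.Gamma(
        "nu", alpha=2, beta=0.1
    )  # add one because student t is undefined for degrees of freedom less than one
    y_ = pm.StudentT("y", mu=f, lam=1.0 / sigma, nu=nu, observed=y)

    # pm.sample defaults apply (tune=1000, draws=1000); requires jax/numpyro,
    # as listed in the notebook's extra_dependencies
    idata = pm.sample(nuts_sampler="numpyro")
```
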
@@ -249,7 +248,9 @@ As you can see by the red shading, the posterior of the GP prior over the functi
 
 ### Prediction using `.conditional`
 
-Next, we extend the model by adding the conditional distribution so we can predict at new $x$ locations. Lets see how the extrapolation looks out to higher $x$. To do this, we extend our `model` with the `conditional` distribution of the GP. Then, we can sample from it using the `trace` and the `sample_posterior_predictive` function. This is similar to how Stan uses its `generated quantities {...}` block. We could have included `gp.conditional` in the model *before* we did the NUTS sampling, but it is more efficient to separate these steps.
+Next, we extend the model by adding the conditional distribution so we can predict at new $x$ locations. Lets see how the extrapolation looks out to higher $x$. To do this, we extend our `model` with the `conditional` distribution of the GP. Then, we can sample from it using the `trace` and the `sample_posterior_predictive` function.
+
+This is similar to how Stan uses its `generated quantities {...}` block. We could have included `gp.conditional` in the model *before* we did the NUTS sampling, but it is more efficient to separate these steps.
 
 ```{code-cell} ipython3
 ---
@@ -259,14 +260,12 @@ jupyter:
 n_new = 200
 X_new = np.linspace(-4, 14, n_new)[:, None]
 
-# add the GP conditional to the model, given the new X values
 with model:
+    # add the GP conditional to the model, given the new X values
     f_pred = gp.conditional("f_pred", X_new, jitter=1e-4)
 
-# Sample from the GP conditional distribution
-with model:
-    ppc = pm.sample_posterior_predictive(idata.posterior, var_names=["f_pred"])
-    idata.extend(ppc)
+    # Sample from the GP conditional distribution
+    idata.extend(pm.sample_posterior_predictive(idata, var_names=["f_pred"]))
 ```
 
 ```{code-cell} ipython3
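
For this prediction hunk, a sketch of the resulting cell consolidated into one runnable piece, assuming the `model`, `gp`, and `idata` objects from the sampling sketch above. It mirrors the added lines of the diff: `gp.conditional` extends the model with the GP's predictive distribution at `X_new`, and `pm.sample_posterior_predictive` draws from it and appends the result to `idata` in a single statement.

```python
import numpy as np
import pymc as pm

n_new = 200
X_new = np.linspace(-4, 14, n_new)[:, None]

with model:
    # add the GP conditional to the model, given the new X values
    f_pred = gp.conditional("f_pred", X_new, jitter=1e-4)

    # Sample from the GP conditional distribution; extend() stores the new
    # posterior_predictive group alongside the existing posterior draws
    idata.extend(pm.sample_posterior_predictive(idata, var_names=["f_pred"]))
```
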
@@ -394,8 +393,7 @@ with model:
     p_pred = pm.Deterministic("p_pred", pm.math.invlogit(f_pred))
 
 with model:
-    ppc = pm.sample_posterior_predictive(idata.posterior, var_names=["f_pred", "p_pred"])
-    idata.extend(ppc)
+    idata.extend(pm.sample_posterior_predictive(idata.posterior, var_names=["f_pred", "p_pred"]))
 ```
 
 ```{code-cell} ipython3
@@ -436,6 +434,7 @@ plt.legend(loc=(0.32, 0.65), frameon=True);
 * Reexecuted by [Colin Caroll](https://github.com/ColCarroll) in 2019 ([pymc#3397](https://github.com/pymc-devs/pymc/pull/3397))
 * Updated for V4 by Bill Engels in September 2022 ([pymc-examples#237](https://github.com/pymc-devs/pymc-examples/pull/237))
 * Updated for V5 by Chris Fonnesbeck in July 2023 ([pymc-examples#549](https://github.com/pymc-devs/pymc-examples/pull/549))
+* Updated by [Alexandre Andorra](https://github.com/AlexAndorra) in May 2024
 
 +++
 