### Prediction using `.conditional`
Next, we extend the model by adding the conditional distribution so we can predict at new $x$ locations. Let's see how the extrapolation looks out to higher $x$. To do this, we extend our `model` with the `conditional` distribution of the GP. Then, we can sample from it using the `trace` and the `sample_posterior_predictive` function.
This is similar to how Stan uses its `generated quantities {...}` block. We could have included `gp.conditional` in the model *before* we did the NUTS sampling, but it is more efficient to separate these steps.
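Before running the cell, it may help to see the math that a GP conditional encodes. The sketch below is a plain-NumPy illustration of the standard GP predictive equations (mean $K_*^\top K^{-1} y$, covariance $K_{**} - K_*^\top K^{-1} K_*$). The squared-exponential kernel, lengthscale, and variable names here are illustrative assumptions, not the notebook's actual model; in practice `gp.conditional` constructs this distribution for you, using more robust Cholesky-based linear algebra.

```python
import numpy as np

def sq_exp(a, b, ell=1.0):
    # squared-exponential covariance between column vectors a (n,1) and b (m,1)
    return np.exp(-0.5 * (a - b.T) ** 2 / ell**2)

X = np.linspace(0, 10, 30)[:, None]        # observed inputs (illustrative)
y = np.sin(X).ravel()                      # observed (noise-free) values
X_new = np.linspace(-4, 14, 200)[:, None]  # prediction grid, as in the notebook

K = sq_exp(X, X) + 1e-6 * np.eye(len(X))   # training covariance, with jitter
K_s = sq_exp(X, X_new)                     # train/new cross-covariance
K_ss = sq_exp(X_new, X_new)                # covariance among new inputs

# GP conditional: Gaussian mean and covariance at the new inputs
mu = K_s.T @ np.linalg.solve(K, y)
cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
```

Drawing from a multivariate normal with this `mu` and `cov` would give function samples analogous to what `sample_posterior_predictive` returns below; note how the conditional variance grows back toward the prior variance away from the data, which is why the extrapolation region gets wide uncertainty bands.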
```{code-cell} ipython3
n_new = 200
X_new = np.linspace(-4, 14, n_new)[:, None]
with model:
    # add the GP conditional to the model, given the new X values
    f_pred = gp.conditional("f_pred", X_new)

    # sample functions from the conditional, reusing the posterior draws in `trace`
    pred_samples = pm.sample_posterior_predictive(trace, var_names=["f_pred"])
```