Hello, I have tried changing priors to combat this (although I have read this isn't really best practice), but didn't get satisfactory results. Are there any solutions for reducing divergences that I am missing, or are divergences not as big of a deal as some make them seem?
Hi @PK1706

Have a look at these Prior Choice Recommendations and see if you can decrease the number of divergences by selecting better priors for your use case. Also, try increasing the default `chains`, `draws`, and `tune` parameters in the `fit` method, for example:

```python
mmm.fit(
    X=X_train,
    y=y_train,
    target_accept=0.95,
    chains=6,
    random_seed=123,
    draws=3_000,
    tune=2_000,
)
```
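
For illustration, here is a minimal, generic PyMC sketch (plain `pymc` on toy data, not the `mmm` object from the question) showing the same two levers: weakly informative priors scaled to the data and stronger sampler settings (`tune`, `draws`, `chains`, `target_accept`).

```python
# Generic PyMC example (an assumption/sketch, not the pymc-marketing MMM API)
# illustrating how prior choice and sampler settings affect divergences.
import numpy as np
import pymc as pm

rng = np.random.default_rng(123)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(scale=0.5, size=100)

with pm.Model() as model:
    # Weakly informative priors scaled to the data; overly diffuse priors
    # often contribute to divergences.
    beta = pm.Normal("beta", mu=0.0, sigma=1.0)
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("obs", mu=beta * x, sigma=sigma, observed=y)

    # The same knobs as in the fit() call above: more tuning steps, more
    # draws, more chains, and a higher target_accept (smaller HMC step size).
    idata = pm.sample(
        draws=3_000,
        tune=2_000,
        chains=6,
        target_accept=0.95,
        random_seed=123,
    )

# Count how many divergent transitions remain after the changes.
print(int(idata.sample_stats["diverging"].sum()))
```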