Syntax errors… Such a small thing, just from reading through.
(Also, the appendix on JS basics contains `3^2` in a comment, which is XOR in JS, so it is misleading; `**` is the exponentiation operator there, or just use words. A quick demo follows below.)

(So it wasn't my imagination; overall quality really did drop significantly after about the first third, with Bayes.)
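For concreteness, since this is an easy thing to misread:

```js
// In JavaScript, ^ is bitwise XOR, not exponentiation:
console.log(3 ^ 2);          // 1  (0b11 XOR 0b10 === 0b01)
console.log(3 ** 2);         // 9  (exponentiation, ES2016+)
console.log(Math.pow(3, 2)); // 9  (the pre-ES2016 spelling)
```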
…And while you're here…
There might be a few similarities between probabilistic and numerical programming/analysis, distributions and numbers (not in basic structure, but in how they were created and are used). Expertise in one area is likely to transfer well to the other.
In probabilistic programming, marginal distributions are reified/manifested with `Infer.*`, and are defined only by that. (A collection of simulating/inferring methods united by a common constraint: simulation and distribution (in/out) converge in a basic sense — when simulated, with forward inference and increasing repeat count, the difference decreases. A sketch follows below.)
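To make the parenthetical concrete, here is a minimal plain-JS sketch of the convergence constraint. It is deliberately not WebPPL's actual `Infer` API, just forward simulation approaching a known marginal:

```js
// Model: the sum of two fair coin flips; exactly, P(sum === 1) = 0.5.
const model = () =>
  (Math.random() < 0.5 ? 1 : 0) + (Math.random() < 0.5 ? 1 : 0);

// Forward "inference": estimate P(sum === 1) from n simulations.
const estimate = (n) => {
  let hits = 0;
  for (let i = 0; i < n; i++) if (model() === 1) hits++;
  return hits / n;
};

// The in/out difference shrinks as the repeat count grows:
for (const n of [100, 10000, 1000000]) {
  console.log(n, Math.abs(estimate(n) - 0.5));
}
```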
In pure mathematics, numbers are manifested with `Number.*` (like computation or Newton's approximation), and are defined only by that. (A collection of approximating methods united by a common constraint: computation and result (in/out) converge in a basic sense — when approximated, by a float with increasing precision, the difference decreases. Again, a sketch follows below.)
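The numeric analogue, sketched with Newton's iteration for square roots (`newtonSqrt` is an illustrative name, not a real API); tightening the tolerance plays the role of increasing precision:

```js
// Newton's iteration for sqrt(a): x <- (x + a / x) / 2.
const newtonSqrt = (a, eps) => {
  let x = a;
  while (Math.abs(x * x - a) > eps) x = (x + a / x) / 2;
  return x;
};

// Decreasing eps (increasing precision) converges toward Math.sqrt:
for (const eps of [1e-1, 1e-6, 1e-12]) {
  console.log(eps, Math.abs(newtonSqrt(2, eps) - Math.SQRT2));
}
```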
In reinforcement learning and hormones, agents are manifested with `RL.*` (like temporal-difference learning (dopamine: expectation and reality converge with time) or Q-learning (serotonin: best action and reality converge with time)), and are defined only by the constraint to converge, not by structure; a toy TD sketch follows below.

In graphics, objects are rendered, and detail-reduction and rasterization methods can be seen as needing to converge to similar rasterizations as resolution increases.
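And the RL item, as a toy TD(0) update (the dopamine/serotonin mapping is the text's analogy, not something the code shows): the value estimate is defined by nothing except the requirement that it converge toward the reward signal.

```js
// TD(0) on a one-state, one-step problem with a noisy reward around 1.0.
let V = 0;             // current value estimate ("expectation")
const alpha = 0.1;     // learning rate
for (let t = 0; t < 1000; t++) {
  const reward = 1.0 + (Math.random() - 0.5) * 0.2; // "reality"
  V += alpha * (reward - V); // move the estimate along the prediction error
}
console.log(V); // ~1.0: expectation and reality have converged
```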
The ability to reverse computation direction for the concepts of its domain — being able to insert `Infer`/`Number` in any place (like on every built-in math function call, to a standard precision, as is effectively done today) without changing semantics, only precision — is shared between those, probably because both are constrained by conceptual convergence.

Somewhat nice, I guess.