* Remove spurious minus signs from the first two code examples in the Potential
function
* Shorten warning about applicability of Potential only to logp-based sampling
pymc/model.py: 5 additions & 11 deletions
@@ -2049,13 +2049,7 @@ def Potential(name, var, model=None, dims=None):
 
     Warnings
     --------
-    Potential functions only influence logp based sampling, like the one used by ``pm.sample``.
-    Potentials, modify the log-probability of the model by adding a contribution to the logp which is used by sampling algorithms which rely on the information about the observed data to generate posterior samples.
-    Potentials are not applicable in the context of forward sampling because they don't affect the prior distribution itself, only the computation of the logp.
-    Forward sampling algorithms generate sample points from the prior distribution of the model, without taking into account the likelihood function.
-    In other words, it does not use the information about the observed data.
-    Hence, Potentials do not affect forward sampling, which is used by ``sample_prior_predictive`` and ``sample_posterior_predictive``.
-    A warning saying "The effect of Potentials on other parameters is ignored during prior predictive sampling" is always emitted to alert user of this.
+    Potential functions only influence logp-based sampling. Therefore, they are applicable for sampling with ``pm.sample`` but not ``pm.sample_prior_predictive`` or ``pm.sample_posterior_predictive``.
 
     Parameters
     ----------
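The shortened warning can be illustrated with a toy NumPy sketch. This is not PyMC's actual sampler machinery: the draws and the rejection step are made up here purely to mimic how forward sampling ignores a hard Potential while logp-based sampling respects it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward (prior) sampling ignores the potential entirely:
prior_samples = rng.normal(0.0, 1.0, size=10_000)  # roughly half are negative

# logp-based sampling sees the potential; a hard x >= 0 constraint adds
# -inf to the logp of negative draws, so they get zero posterior weight.
with np.errstate(divide="ignore"):
    log_weight = np.log(np.where(prior_samples >= 0, 1.0, 0.0))
kept = prior_samples[np.isfinite(log_weight)]

print((prior_samples < 0).any())  # True  -- prior draws ignore the constraint
print((kept < 0).any())           # False -- logp-weighted draws respect it
```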
@@ -2077,7 +2071,7 @@ def Potential(name, var, model=None, dims=None):
     Have a look at the following example:
 
     In this example, we define a constraint on ``x`` to be greater or equal to 0 via the ``pm.Potential`` function.
-    We pass ``-pm.math.log(pm.math.switch(constraint, 1, 0))`` as second argument which will return an expression depending on if the constraint is met or not and which will be added to the likelihood of the model.
+    We pass ``pm.math.log(pm.math.switch(constraint, 1, 0))`` as second argument which will return an expression depending on if the constraint is met or not and which will be added to the likelihood of the model.
     The probability density that this model produces agrees strongly with the constraint that ``x`` should be greater than or equal to 0. All the cases that do not satisfy the constraint are strictly not considered.
 
     .. code:: python
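As a rough sanity check of the corrected hard-constraint expression, here is a NumPy stand-in for ``pm.math.log(pm.math.switch(...))`` (NumPy's ``log`` and ``where`` behave analogously for these values; the sample points for ``x`` are invented for illustration):

```python
import numpy as np

# Hypothetical evaluation points for x; purely illustrative.
x = np.array([-1.0, 0.0, 2.0])
constraint = x >= 0

# log(switch(constraint, 1, 0)) contributes log(1) = 0 where the constraint
# holds and log(0) = -inf where it is violated, so violating points get
# exactly zero posterior probability.
with np.errstate(divide="ignore"):
    potential = np.log(np.where(constraint, 1.0, 0.0))

# potential[0] is -inf; potential[1] and potential[2] are 0.0
```

This also shows why the spurious leading minus sign was a bug: negating the expression would have *added* +inf for violating points instead of removing them.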
@@ -2086,9 +2080,9 @@ def Potential(name, var, model=None, dims=None):
-    However, if we use ``-pm.math.log(pm.math.switch(constraint, 1, 0.5))`` the potential again penalizes the likelihood when constraint is not met but with some deviations allowed.
+    However, if we use ``pm.math.log(pm.math.switch(constraint, 1.0, 0.5))`` the potential again penalizes the likelihood when constraint is not met but with some deviations allowed.
     Here, Potential function is used to pass a soft constraint.
     A soft constraint is a constraint that is only partially satisfied.
     The effect of this is that the posterior probability for the parameters decreases as they move away from the constraint, but does not become exactly zero.
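The soft-constraint variant can be checked the same way with a NumPy stand-in (again, the evaluation points are invented): violating points are penalized by log(0.5) ≈ -0.693 rather than excluded outright.

```python
import numpy as np

# Hypothetical evaluation points for x; purely illustrative.
x = np.array([-1.0, 0.0, 2.0])
constraint = x >= 0

# log(switch(constraint, 1.0, 0.5)) contributes log(0.5) where the
# constraint is violated: a finite penalty, so those points keep
# nonzero (just reduced) posterior probability.
potential = np.log(np.where(constraint, 1.0, 0.5))

# potential[0] is log(0.5) ~ -0.693; the other entries are 0.0
```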
@@ -2100,7 +2094,7 @@ def Potential(name, var, model=None, dims=None):