
Commit a460468

Fix a few sphinx warnings for new doc build (#3702)
1 parent ece47e3 commit a460468

8 files changed: +17 −23 lines changed

docs/source/api.rst

Lines changed: 0 additions & 1 deletion
@@ -14,7 +14,6 @@ API Reference
    api/gp
    api/plots
    api/stats
-   api/diagnostics
    api/backends
    api/math
    api/data

docs/source/api/diagnostics.rst

Lines changed: 0 additions & 8 deletions
This file was deleted.

docs/source/api/distributions/transforms.rst

Lines changed: 2 additions & 1 deletion
@@ -40,6 +40,7 @@ initialized instances of the Transform Classes, which are described
 below.

 .. glossary::
+
     ``stick_breaking``
         Instantiation of :class:`~pymc3.distributions.transforms.StickBreaking`
         :class:`~pymc3.distributions.transforms.Transform` class for use in the ``transform``
@@ -93,7 +94,7 @@ below.
     for use in the ``transform`` argument of a random variable.


-.. autofunction:: t_stick_breaking
+.. autofunction:: t_stick_breaking


 Transform Base Classes
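
The glossary entries quoted above are pre-instantiated transform objects intended for the `transform` argument of a random variable. A minimal usage sketch, not part of this commit (the Dirichlet variable and its size are illustrative; `stick_breaking` is in fact already the Dirichlet default):

    import numpy as np
    import pymc3 as pm
    from pymc3.distributions import transforms

    with pm.Model():
        # Pass the pre-instantiated transform described in the glossary explicitly;
        # t_stick_breaking(eps) builds the same transform with a custom epsilon.
        p = pm.Dirichlet("p", a=np.ones(3), transform=transforms.stick_breaking)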

docs/source/api/inference.rst

Lines changed: 2 additions & 2 deletions
@@ -54,9 +54,9 @@ Hamiltonian Monte Carlo
 Sequential Monte Carlo
 ~~~~~~~~~~~~~~~~~~~~~~~

-.. currentmodule:: pymc3.step_methods.smc
+.. currentmodule:: pymc3.step_methods.smc.smc

-.. autoclass:: pymc3.step_methods.smc.SMC
+.. autoclass:: pymc3.smc.smc.SMC
     :members:

docs/source/conf.py

Lines changed: 1 addition & 0 deletions
@@ -40,6 +40,7 @@
     "nbsphinx",
     "numpydoc",
     "IPython.sphinxext.ipython_console_highlighting",
+    "IPython.sphinxext.ipython_directive",
     "sphinx.ext.autosectionlabel",
     "sphinx.ext.napoleon",
     "gallery_generator",

docs/source/developer_guide.rst

Lines changed: 2 additions & 2 deletions
@@ -906,7 +906,7 @@ TFP, which is a tenor in tensor out function. Moreover, transition
 kernels in TFP do not flatten the tensors, see eg docstring of
 `tensorflow\_probability/python/mcmc/random\_walk\_metropolis.py <https://github.com/tensorflow/probability/blob/master/tensorflow_probability/python/mcmc/random_walk_metropolis.py>`__:

-.. code:: python
+.. code::

     new_state_fn: Python callable which takes a list of state parts and a
     seed; returns a same-type `list` of `Tensor`s, each being a perturbation
@@ -945,7 +945,7 @@ where it builds the objective function by calling:

 Where:

-.. code:: python
+.. code::

     op : Operator class
     approx : Approximation class or instance

docs/source/notebooks/PyMC3_tips_and_heuristic.ipynb

Lines changed: 6 additions & 6 deletions
@@ -2325,7 +2325,7 @@
 "You can find more details in the original [Stan case study](http://mc-stan.org/documentation/case-studies/mbjoseph-CARStan.html). You might come across similar constructs in Gaussian Process, which result in a zero-mean Gaussian distribution conditioned on a covariance function.\n",
 "\n",
 "In the `Stan` Code, matrix D is generated in the model using a `transformed data{}` block:\n",
-"```Stan\n",
+"```\n",
 "transformed data{\n",
 " vector[n] zeros;\n",
 " matrix<lower = 0>[n, n] D;\n",
@@ -2359,7 +2359,7 @@
 "metadata": {},
 "source": [
 "Then in the `Stan` model:\n",
-"```stan\n",
+"```\n",
 "model {\n",
 " phi ~ multi_normal_prec(zeros, tau * (D - alpha * W));\n",
 " ...\n",
@@ -2475,7 +2475,7 @@
 "Note that in the node $\\phi \\sim \\mathcal{N}(0, [D_\\tau (I - \\alpha B)]^{-1})$, we are computing the log-likelihood for a multivariate Gaussian distribution, which might not scale well in high-dimensions. We can take advantage of the fact that the covariance matrix here $[D_\\tau (I - \\alpha B)]^{-1}$ is **sparse**, and there are faster ways to compute its log-likelihood. \n",
 "\n",
 "For example, a more efficient sparse representation of the CAR in `STAN`:\n",
-"```stan\n",
+"```\n",
 "functions {\n",
 " /**\n",
 " * Return the log probability of a proper conditional autoregressive (CAR) prior \n",
@@ -2513,7 +2513,7 @@
 "}\n",
 "```\n",
 "with the data transformed in the model:\n",
-"```stan\n",
+"```\n",
 "transformed data {\n",
 " int W_sparse[W_n, 2]; // adjacency pairs\n",
 " vector[n] D_sparse; // diagonal of D (number of neigbors for each site)\n",
@@ -2544,7 +2544,7 @@
 "}\n",
 "```\n",
 "and the likelihood:\n",
-"```stan\n",
+"```\n",
 "model {\n",
 " phi ~ sparse_car(tau, alpha, W_sparse, D_sparse, lambda, n, W_n);\n",
 "}\n",
@@ -2901,4 +2901,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 2
-}
+}
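
These hunks only drop the Stan language tag from the notebook's fenced code blocks; the quoted Stan code itself is untouched. For readers following the CAR discussion, the line `phi ~ multi_normal_prec(zeros, tau * (D - alpha * W))` has a direct PyMC3 counterpart via a precision-parameterised MvNormal. A rough sketch, assuming `D`, `W` (n x n arrays) and `n` come from the notebook's data setup, with illustrative priors on `tau` and `alpha`:

    import numpy as np
    import pymc3 as pm

    with pm.Model():
        tau = pm.Gamma("tau", alpha=2.0, beta=2.0)          # illustrative prior
        alpha = pm.Uniform("alpha", lower=0.0, upper=1.0)   # illustrative prior
        # MvNormal parameterised by its precision matrix, mirroring multi_normal_prec
        phi = pm.MvNormal("phi", mu=np.zeros(n), tau=tau * (D - alpha * W), shape=n)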

pymc3/distributions/transforms.py

Lines changed: 4 additions & 3 deletions
@@ -219,9 +219,10 @@ def backward_val(self, x):
         return np.log(1 + np.exp(-np.abs(x))) + np.max([x, 0])

     def forward(self, x):
-        """Inverse operation of softplus
-        y = Log(Exp(x) - 1)
-        = Log(1 - Exp(-x)) + x
+        """Inverse operation of softplus.
+
+        y = Log(Exp(x) - 1)
+          = Log(1 - Exp(-x)) + x
         """
         return tt.log(1.0 - tt.exp(-x)) + x
