
Commit bd721be

edits to blog post (#1766)
Signed-off-by: Chris Abraham <[email protected]>
1 parent: b65d7a6


1 file changed: _posts/2021-6-8-overview-of-pytorch-autograd-engine.md (+2 -2)
@@ -37,7 +37,7 @@ In the example above, when multiplying x and y to obtain v, the engine will exte
 <p>Figure 2: Computational graph extended after executing the logarithm</p>
 </div>
 
-Continuing, the engine now calculates the <a href="https://www.codecogs.com/eqnedit.php?latex=log(v)" target="_blank"><img src="https://latex.codecogs.com/gif.latex?log(v)" title="log(v)" /></a> operation and extends the graph again with the log derivative that it knows to be <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{1}{v}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{1}{v}" title="\frac{1}{v}" /></a>. This is shown in figure 3. This operation generates the result <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{\partial&space;w}{\partial&space;v}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{\partial&space;w}{\partial&space;v}" title="\frac{\partial w}{\partial v}" /></a> that when propagated backward and multiplied by the multiplication derivative as in the chain rule, generates the derivatives <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{\partial&space;w}{\partial&space;x}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{\partial&space;w}{\partial&space;x}" title="\frac{\partial w}{\partial x}" /></a>, <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{\partial&space;w}{\partial&space;x}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{\partial&space;w}{\partial&space;x}" title="\frac{\partial w}{\partial x}" /></a>.
+Continuing, the engine now calculates the <a href="https://www.codecogs.com/eqnedit.php?latex=log(v)" target="_blank"><img src="https://latex.codecogs.com/gif.latex?log(v)" title="log(v)" /></a> operation and extends the graph again with the log derivative that it knows to be <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{1}{v}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{1}{v}" title="\frac{1}{v}" /></a>. This is shown in figure 3. This operation generates the result <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{\partial&space;w}{\partial&space;v}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{\partial&space;w}{\partial&space;v}" title="\frac{\partial w}{\partial v}" /></a> that when propagated backward and multiplied by the multiplication derivative as in the chain rule, generates the derivatives <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{\partial&space;w}{\partial&space;x}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{\partial&space;w}{\partial&space;x}" title="\frac{\partial w}{\partial x}" /></a>, <a href="https://www.codecogs.com/eqnedit.php?latex=\frac{\partial&space;w}{\partial&space;y}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\frac{\partial&space;w}{\partial&space;y}" title="\frac{\partial w}{\partial y}" /></a>.
 
 <div class="text-center">
 <img src="{{ site.baseurl }}/assets/images/extended_computational_graph.png" width="100%">
@@ -111,7 +111,7 @@ We can execute the same expression in PyTorch and calculate the gradient of the
 <pre>>>> y.backward(1.0)</pre>
 <pre>>>> x.grad</pre>
 tensor([1.3633,
-0.1912])</pre>
+0.1912])
 </div>
 
 The result is the same as our hand-calculated Jacobian-vector product!
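
For context, the snippet whose tail appears in this hunk prints the gradient of the post's example expression. The lines defining x and y fall outside the diff context, so the reconstruction below is an assumption, chosen because it reproduces the printed values (sin(0.75)/0.5 ≈ 1.3633 and sin(0.75)/0.75 + log(0.375)·cos(0.75) ≈ 0.1912):

import torch

# Assumed expression from earlier in the post: f(x1, x2) = log(x1 * x2) * sin(x2)
x = torch.tensor([0.5, 0.75], requires_grad=True)
y = torch.log(x[0] * x[1]) * torch.sin(x[1])
y.backward(torch.tensor(1.0))  # the 1.0 is the v of the Jacobian-vector product J^T v
print(x.grad)                  # tensor([1.3633, 0.1912])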
