
Commit bf436ad

Generate Python docs from pytorch/pytorch@b241698
1 parent 9656735

1,700 files changed: +2480, -2367 lines
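Two changes repeat across the files below: the version banner in each page's left menu moves from 1.10.0a0+git145a20b to 1.10.0a0+gitb241698, and several functions in the rendered source pages gain viewcode-block wrappers with [docs] backlinks to their generated API pages. The viewcode-block / viewcode-back markup is what Sphinx's viewcode extension emits; for reference, a minimal conf.py sketch that would produce it (the extension list is illustrative, not PyTorch's actual docs configuration):

    # conf.py -- minimal sketch; extension list is illustrative, not PyTorch's real config
    extensions = [
        "sphinx.ext.autodoc",   # renders docstrings into the generated API pages
        "sphinx.ext.viewcode",  # emits the viewcode-block divs and [docs] backlinks seen in this diff
    ]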


docs/master/__config__.html (+1, -1)

@@ -193,7 +193,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/index.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch.html (+45, -45)
Large diffs are not rendered by default.

docs/master/_modules/torch/__config__.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/_jit_internal.html (+3, -3)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

@@ -899,7 +899,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">fn</span></div>


-<span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
+<div class="viewcode-block" id="unused"><a class="viewcode-back" href="../../generated/torch.jit.unused.html#torch.jit.unused">[docs]</a><span class="k">def</span> <span class="nf">unused</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>
 <span class="sd">&quot;&quot;&quot;</span>
 <span class="sd"> This decorator indicates to the compiler that a function or method should</span>
 <span class="sd"> be ignored and replaced with the raising of an exception. This allows you</span>

@@ -946,7 +946,7 @@ <h1>Source code for torch._jit_internal</h1><div class="highlight"><pre>
 <span class="k">return</span> <span class="n">prop</span>

 <span class="n">fn</span><span class="o">.</span><span class="n">_torchscript_modifier</span> <span class="o">=</span> <span class="n">FunctionModifiers</span><span class="o">.</span><span class="n">UNUSED</span>
-<span class="k">return</span> <span class="n">fn</span>
+<span class="k">return</span> <span class="n">fn</span></div>

 <span class="c1"># No op context manager from python side</span>
 <span class="k">class</span> <span class="nc">_IgnoreContextManager</span><span class="p">(</span><span class="n">contextlib</span><span class="o">.</span><span class="n">AbstractContextManager</span><span class="p">):</span>

docs/master/_modules/torch/_lobpcg.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/_lowrank.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/_tensor.html (+13, -13)
Large diffs are not rendered by default.

docs/master/_modules/torch/_tensor_str.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/_utils.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/_vmap_internals.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/autograd.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/autograd/anomaly_mode.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

docs/master/_modules/torch/autograd/function.html (+13, -13)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>

@@ -403,7 +403,7 @@ <h1>Source code for torch.autograd.function</h1><div class="highlight"><pre>

 <span class="k">class</span> <span class="nc">_ContextMethodMixin</span><span class="p">(</span><span class="nb">object</span><span class="p">):</span>

-<span class="k">def</span> <span class="nf">save_for_backward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">tensors</span><span class="p">):</span>
+<div class="viewcode-block" id="_ContextMethodMixin.save_for_backward"><a class="viewcode-back" href="../../../generated/torch.autograd.function._ContextMethodMixin.save_for_backward.html#torch.autograd._ContextMethodMixin.save_for_backward">[docs]</a> <span class="k">def</span> <span class="nf">save_for_backward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">tensors</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Saves given tensors for a future call to :func:`~Function.backward`.</span>

 <span class="sd"> **This should be called at most once, and only from inside the**</span>

@@ -415,9 +415,9 @@ <h1>Source code for torch.autograd.function</h1><div class="highlight"><pre>

 <span class="sd"> Arguments can also be ``None``.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="bp">self</span><span class="o">.</span><span class="n">to_save</span> <span class="o">=</span> <span class="n">tensors</span>
+<span class="bp">self</span><span class="o">.</span><span class="n">to_save</span> <span class="o">=</span> <span class="n">tensors</span></div>

-<span class="k">def</span> <span class="nf">mark_dirty</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">):</span>
+<div class="viewcode-block" id="_ContextMethodMixin.mark_dirty"><a class="viewcode-back" href="../../../generated/torch.autograd.function._ContextMethodMixin.mark_dirty.html#torch.autograd._ContextMethodMixin.mark_dirty">[docs]</a> <span class="k">def</span> <span class="nf">mark_dirty</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Marks given tensors as modified in an in-place operation.</span>

 <span class="sd"> **This should be called at most once, only from inside the**</span>

@@ -428,15 +428,15 @@ <h1>Source code for torch.autograd.function</h1><div class="highlight"><pre>
 <span class="sd"> It doesn&#39;t matter whether the function is called before or after</span>
 <span class="sd"> modification.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="bp">self</span><span class="o">.</span><span class="n">dirty_tensors</span> <span class="o">=</span> <span class="n">args</span>
+<span class="bp">self</span><span class="o">.</span><span class="n">dirty_tensors</span> <span class="o">=</span> <span class="n">args</span></div>

 <span class="k">def</span> <span class="nf">mark_shared_storage</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">pairs</span><span class="p">):</span>
 <span class="n">warnings</span><span class="o">.</span><span class="n">warn</span><span class="p">(</span>
 <span class="s1">&#39;mark_shared_storage is deprecated. &#39;</span>
 <span class="s1">&#39;Tensors with shared storages are automatically tracked. Note &#39;</span>
 <span class="s1">&#39;that calls to `set_()` are not tracked&#39;</span><span class="p">)</span>

-<span class="k">def</span> <span class="nf">mark_non_differentiable</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">):</span>
+<div class="viewcode-block" id="_ContextMethodMixin.mark_non_differentiable"><a class="viewcode-back" href="../../../generated/torch.autograd.function._ContextMethodMixin.mark_non_differentiable.html#torch.autograd._ContextMethodMixin.mark_non_differentiable">[docs]</a> <span class="k">def</span> <span class="nf">mark_non_differentiable</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Marks outputs as non-differentiable.</span>

 <span class="sd"> **This should be called at most once, only from inside the**</span>

@@ -450,17 +450,17 @@ <h1>Source code for torch.autograd.function</h1><div class="highlight"><pre>

 <span class="sd"> This is used e.g. for indices returned from a max :class:`Function`.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="bp">self</span><span class="o">.</span><span class="n">non_differentiable</span> <span class="o">=</span> <span class="n">args</span>
+<span class="bp">self</span><span class="o">.</span><span class="n">non_differentiable</span> <span class="o">=</span> <span class="n">args</span></div>

-<span class="k">def</span> <span class="nf">set_materialize_grads</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">value</span><span class="p">):</span>
+<div class="viewcode-block" id="_ContextMethodMixin.set_materialize_grads"><a class="viewcode-back" href="../../../generated/torch.autograd.function._ContextMethodMixin.set_materialize_grads.html#torch.autograd._ContextMethodMixin.set_materialize_grads">[docs]</a> <span class="k">def</span> <span class="nf">set_materialize_grads</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">value</span><span class="p">):</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Sets whether to materialize output grad tensors. Default is true.</span>

 <span class="sd"> **This should be called only from inside the** :func:`forward` **method**</span>

 <span class="sd"> If true, undefined output grad tensors will be expanded to tensors full</span>
 <span class="sd"> of zeros prior to calling the :func:`backward` method.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
-<span class="bp">self</span><span class="o">.</span><span class="n">materialize_grads</span> <span class="o">=</span> <span class="n">value</span>
+<span class="bp">self</span><span class="o">.</span><span class="n">materialize_grads</span> <span class="o">=</span> <span class="n">value</span></div>

 <span class="k">class</span> <span class="nc">_HookMixin</span><span class="p">(</span><span class="nb">object</span><span class="p">):</span>


@@ -547,7 +547,7 @@ <h1>Source code for torch.autograd.function</h1><div class="highlight"><pre>
 <span class="c1"># for the tracer</span>
 <span class="n">is_traceable</span> <span class="o">=</span> <span class="kc">False</span>

-<span class="nd">@staticmethod</span>
+<div class="viewcode-block" id="Function.forward"><a class="viewcode-back" href="../../../generated/torch.autograd.Function.forward.html#torch.autograd.Function.forward">[docs]</a> <span class="nd">@staticmethod</span>
 <span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="n">ctx</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">:</span> <span class="n">Any</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="n">Any</span><span class="p">:</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Performs the operation.</span>


@@ -560,9 +560,9 @@ <h1>Source code for torch.autograd.function</h1><div class="highlight"><pre>
 <span class="sd"> retrieved during the backward pass.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
 <span class="k">raise</span> <span class="ne">NotImplementedError</span><span class="p">(</span><span class="s2">&quot;You must implement the forward function for custom&quot;</span>
-<span class="s2">&quot; autograd.Function.&quot;</span><span class="p">)</span>
+<span class="s2">&quot; autograd.Function.&quot;</span><span class="p">)</span></div>

-<span class="nd">@staticmethod</span>
+<div class="viewcode-block" id="Function.backward"><a class="viewcode-back" href="../../../generated/torch.autograd.Function.backward.html#torch.autograd.Function.backward">[docs]</a> <span class="nd">@staticmethod</span>
 <span class="k">def</span> <span class="nf">backward</span><span class="p">(</span><span class="n">ctx</span><span class="p">:</span> <span class="n">Any</span><span class="p">,</span> <span class="o">*</span><span class="n">grad_outputs</span><span class="p">:</span> <span class="n">Any</span><span class="p">)</span> <span class="o">-&gt;</span> <span class="n">Any</span><span class="p">:</span>
 <span class="sa">r</span><span class="sd">&quot;&quot;&quot;Defines a formula for differentiating the operation.</span>


@@ -585,7 +585,7 @@ <h1>Source code for torch.autograd.function</h1><div class="highlight"><pre>
 <span class="sd"> output.</span>
 <span class="sd"> &quot;&quot;&quot;</span>
 <span class="k">raise</span> <span class="ne">NotImplementedError</span><span class="p">(</span><span class="s2">&quot;You must implement the backward function for custom&quot;</span>
-<span class="s2">&quot; autograd.Function.&quot;</span><span class="p">)</span></div>
+<span class="s2">&quot; autograd.Function.&quot;</span><span class="p">)</span></div></div>


 <span class="k">def</span> <span class="nf">once_differentiable</span><span class="p">(</span><span class="n">fn</span><span class="p">):</span>

docs/master/_modules/torch/autograd/functional.html (+1, -1)

@@ -192,7 +192,7 @@
 <div class="pytorch-left-menu-search">

 <div class="version">
-<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+git145a20b ) &#x25BC</a>
+<a href='https://pytorch.org/docs/versions.html'>master (1.10.0a0+gitb241698 ) &#x25BC</a>
 </div>
