4 files changed (+4 −4)

@@ -56,7 +56,7 @@ additional comments::

     # This function has only a single output, so it gets only one gradient
     def backward(self, grad_output):
         # This is a pattern that is very convenient - at the top of backward
-        # unpack saved_tensors and initialize all gradients w.r.t. inputs to
+        # unpack saved_variables and initialize all gradients w.r.t. inputs to
         # None. Thanks to the fact that additional trailing Nones are
         # ignored, the return statement is simple even when the function has
         # optional inputs.
@@ -56,7 +56,7 @@ additional comments::

     @staticmethod
     def backward(ctx, grad_output):
         # This is a pattern that is very convenient - at the top of backward
-        # unpack saved_tensors and initialize all gradients w.r.t. inputs to
+        # unpack saved_variables and initialize all gradients w.r.t. inputs to
         # None. Thanks to the fact that additional trailing Nones are
         # ignored, the return statement is simple even when the function has
         # optional inputs.
@@ -56,7 +56,7 @@ additional comments::

     @staticmethod
     def backward(ctx, grad_output):
         # This is a pattern that is very convenient - at the top of backward
-        # unpack saved_tensors and initialize all gradients w.r.t. inputs to
+        # unpack saved_variables and initialize all gradients w.r.t. inputs to
         # None. Thanks to the fact that additional trailing Nones are
         # ignored, the return statement is simple even when the function has
         # optional inputs.
@@ -56,7 +56,7 @@ additional comments::

     @staticmethod
     def backward(ctx, grad_output):
         # This is a pattern that is very convenient - at the top of backward
-        # unpack saved_tensors and initialize all gradients w.r.t. inputs to
+        # unpack saved_variables and initialize all gradients w.r.t. inputs to
         # None. Thanks to the fact that additional trailing Nones are
         # ignored, the return statement is simple even when the function has
         # optional inputs.