get_gradient
Calculate gradients of a target w.r.t. sources.
This method can be used with TensorFlow tensors:

```python
import tensorflow as tf
import fastestimator as fe

x = tf.Variable([1.0, 2.0, 3.0])
with tf.GradientTape(persistent=True) as tape:
    y = x * x
    b = fe.backend.get_gradient(target=y, sources=x, tape=tape)  # [2.0, 4.0, 6.0]
    b = fe.backend.get_gradient(target=b, sources=x, tape=tape)  # None
    b = fe.backend.get_gradient(target=y, sources=x, tape=tape, higher_order=True)  # [2.0, 4.0, 6.0]
    b = fe.backend.get_gradient(target=b, sources=x, tape=tape)  # [2.0, 2.0, 2.0]
```
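Note that the tape above is created with `persistent=True` so that it can be differentiated against more than once; a non-persistent `tf.GradientTape` releases its resources after the first gradient computation and would raise a RuntimeError on the second call.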
This method can be used with PyTorch tensors:

```python
import torch
import fastestimator as fe

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * x
b = fe.backend.get_gradient(target=y, sources=x)  # [2.0, 4.0, 6.0]
b = fe.backend.get_gradient(target=b, sources=x)  # Error - b does not have a backward function
b = fe.backend.get_gradient(target=y, sources=x, higher_order=True)  # [2.0, 4.0, 6.0]
b = fe.backend.get_gradient(target=b, sources=x)  # [2.0, 2.0, 2.0]
```
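Since `sources` accepts either a single tensor or a sequence of tensors, gradients for several sources can be requested at once. Here is a minimal sketch (PyTorch shown; the values are worked out by hand and assume gradients are returned in the same order as the sources, mirroring `torch.autograd.grad`):

```python
import torch
import fastestimator as fe

w = torch.tensor([2.0], requires_grad=True)
b = torch.tensor([1.0], requires_grad=True)
y = 3.0 * w + b  # dy/dw = 3, dy/db = 1
# Passing a list of sources yields one gradient per source, in order:
dw, db = fe.backend.get_gradient(target=y, sources=[w, b])  # dw = [3.0], db = [1.0]
```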
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`target` | `Tensor` | The target (final) tensor. | required |
`sources` | `Union[Iterable[Tensor], Tensor]` | A sequence of source (initial) tensors. | required |
`higher_order` | `bool` | Whether the gradient will be used for higher order gradients. | `False` |
`tape` | `Optional[GradientTape]` | TensorFlow gradient tape. Only needed when using the TensorFlow backend. | `None` |
`retain_graph` | `bool` | Whether to retain the PyTorch graph. Only valid when using the PyTorch backend. | `True` |
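To illustrate `retain_graph`, here is a minimal PyTorch sketch (values follow the earlier example; the semantics mirror `torch.autograd.grad`, where the graph is freed after a call with `retain_graph=False`):

```python
import torch
import fastestimator as fe

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * x
# retain_graph=True (the default) keeps the autograd graph alive, so the
# same target can be differentiated again later:
g1 = fe.backend.get_gradient(target=y, sources=x, retain_graph=True)   # [2.0, 4.0, 6.0]
# retain_graph=False frees the graph after this call; a further
# get_gradient(target=y, ...) would then fail:
g2 = fe.backend.get_gradient(target=y, sources=x, retain_graph=False)  # [2.0, 4.0, 6.0]
```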
Returns:

Type | Description |
---|---|
`Union[Iterable[Tensor], Tensor]` | Gradient(s) of the `target` with respect to the `sources`. |
Raises:

Type | Description |
---|---|
`ValueError` | If `target` is an unacceptable data type. |
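As a hypothetical trigger for this error, passing a tensor type that neither backend recognizes, such as a NumPy array, should raise the ValueError, assuming the backend dispatch only accepts TensorFlow and PyTorch tensors:

```python
import numpy as np
import fastestimator as fe

x = np.array([1.0, 2.0, 3.0])
y = x * x
# NumPy arrays carry no autograd information, so this is rejected:
fe.backend.get_gradient(target=y, sources=x)  # raises ValueError
```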