
Compute the Jacobian of a Given Function in PyTorch
The jacobian() function, available in the torch.autograd.functional module, computes the Jacobian of a given function. The function whose Jacobian is being computed takes a tensor (or a tuple of tensors) as input and returns a tensor or a tuple of tensors. jacobian() returns a tensor containing the Jacobian values computed for the function at the given input.
Syntax
torch.autograd.functional.jacobian(func, inputs)
Parameters
func − A Python function for which the Jacobian is computed.
inputs − The input to the function func; a tensor or a tuple of tensors.
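As a minimal sketch of the call (the function square below is only an illustration and is not one of this tutorial's examples) −
import torch
from torch.autograd.functional import jacobian

# square the input; d(x**2)/dx = 2*x, which is 6 at x = 3
def square(x):
   return x**2

x = torch.tensor([3.])
print(jacobian(square, x))   # tensor([[6.]])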
Steps
We can use the following steps to compute the Jacobian of a given function −
Import the required library. In all the following examples, the required Python library is torch. Make sure you have already installed it.
import torch
from torch.autograd.functional import jacobian
Define a function func for which the Jacobian is to be calculated. The input to this function is input.
def func(x):
   return x**3 + 4*x - 10
Define the tensor input to the function, func.
input = torch.tensor([2.,3.,4.])
Compute the Jacobian of the function defined above for the given input.
output = jacobian(func, input)
Print the tensor containing the computed Jacobians.
print("Jacobians Tensor:
", output)
Example 1
# Import the required libraries
import torch
from torch.autograd.functional import jacobian

# define a function
def func(x):
   return x**3 + 4*x - 10

# define the inputs
input1 = torch.tensor([2.])
input2 = torch.tensor([2., 3.])
input3 = torch.tensor([2., 3., 4.])

# compute the Jacobians
output1 = jacobian(func, input1)
output2 = jacobian(func, input2)
output3 = jacobian(func, input3)

# print the Jacobians calculated above
print("Jacobian Tensor:\n", output1)
print("Jacobian Tensor:\n", output2)
print("Jacobian Tensor:\n", output3)
Output
Jacobian Tensor:
 tensor([[16.]])
Jacobian Tensor:
 tensor([[16.,  0.],
        [ 0., 31.]])
Jacobian Tensor:
 tensor([[16.,  0.,  0.],
        [ 0., 31.,  0.],
        [ 0.,  0., 52.]])
In the above example, we computed the Jacobian of the same function for inputs of different sizes.
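Because func is applied elementwise, each Jacobian above is a diagonal matrix whose entries are the derivative 3*x**2 + 4 evaluated at the corresponding input element. As a quick sketch (not part of the original example), we can verify the computed Jacobian against this analytical derivative −
import torch
from torch.autograd.functional import jacobian

def func(x):
   return x**3 + 4*x - 10

input = torch.tensor([2., 3., 4.])

# Jacobian computed by autograd
auto_jac = jacobian(func, input)

# Analytical derivative of x**3 + 4*x - 10 is 3*x**2 + 4; since func acts
# elementwise, these values lie on the diagonal of the Jacobian
analytic = torch.diag(3*input**2 + 4)

print(torch.allclose(auto_jac, analytic))   # True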
Example 2
import torch
from torch.autograd.functional import jacobian

# define a function
def func(x, y):
   return x.pow(3) + y

# here the input is a tuple of two tensors, one for x and the other for y
input1 = (torch.tensor([2.]), torch.tensor([5.]))
input2 = (torch.tensor([2., 3., 4.]), torch.tensor([5., 6., 7.]))

output1 = jacobian(func, input1)
output2 = jacobian(func, input2)

print(output1)
print(output2)
Output
(tensor([[12.]]), tensor([[1.]]))
(tensor([[12.,  0.,  0.],
        [ 0., 27.,  0.],
        [ 0.,  0., 48.]]), tensor([[1., 0., 0.],
        [0., 1., 0.],
        [0., 0., 1.]]))
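Both examples above use elementwise functions, which is why the Jacobians are diagonal. For a function that mixes input elements, the Jacobian is a full matrix. The following sketch (the matrix W and the function linear_func are assumptions for illustration, not part of the tutorial) shows this for a simple linear map −
import torch
from torch.autograd.functional import jacobian

W = torch.tensor([[1., 2., 3.],
                  [4., 5., 6.]])

# a linear map from a 3-vector to a 2-vector
def linear_func(x):
   return W @ x

x = torch.tensor([1., 1., 1.])

# the Jacobian of W @ x with respect to x is W itself
print(jacobian(linear_func, x))
# tensor([[1., 2., 3.],
#         [4., 5., 6.]])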