Jacobian Matrix in PyTorch



In this article we will learn about the Jacobian matrix and how to calculate it using different methods in PyTorch. The Jacobian matrix appears in many machine learning applications.

Jacobian Matrix

The Jacobian matrix describes the relationship between the input and output variables of a function: it contains all the first-order partial derivatives of a vector-valued function. It is used in many machine learning applications. Here are some of its common uses, followed by a short formal definition:

  • Analyzing the gradients and derivatives of functions in multivariable calculus.

  • Solving systems of differential equations.

  • Calculating the inverse of vector-valued functions.

  • Analyzing the stability of dynamical systems.
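For reference, for a vector-valued function f : Rⁿ → Rᵐ the Jacobian is the m×n matrix whose entry in row i and column j is the partial derivative of the i-th output with respect to the j-th input:

J[i][j] = ∂f_i / ∂x_j

In the PyTorch examples below the inputs and outputs are themselves matrices, so this "matrix" generalizes to a higher-dimensional tensor of partial derivatives.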

Calculating the Jacobian Matrix in PyTorch

First, install the PyTorch module using the following command:

pip install torch

To calculate the Jacobian matrix we use torch.autograd.functional.jacobian(), a function provided by PyTorch that computes the Jacobian of any given function. It takes the following arguments (a minimal usage sketch follows the list):

  • func − A Python function that takes a tensor as input and returns a tensor or a tuple of tensors as output after performing some operation on the input.

  • inputs − The input that we want to pass to func; it can be a tensor or a tuple of tensors.
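Here is a minimal sketch of how these two arguments fit together; the function name square and the sample tensor are only illustrative.

import torch

# Illustrative function: element-wise square of a 1D tensor.
# Its Jacobian is a diagonal matrix with 2*x on the diagonal.
def square(x):
    return x ** 2

x = torch.tensor([1.0, 2.0, 3.0])
jac = torch.autograd.functional.jacobian(square, x)
print(jac)
# tensor([[2., 0., 0.],
#         [0., 4., 0.],
#         [0., 0., 6.]])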

Let's see different programs that calculate the Jacobian matrix.

Example 1: Creating a Simple Matrix and Calculating the Jacobian

import torch

mat = torch.tensor([[1.0, 2.0],[3.0, 4.0]])
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), mat)
print("Jacobian Matrix:")
print(jacobian)

Output

Jacobian Matrix: tensor([[1., 1.], [1., 1.]])

Explanation

In the above program we created a 2×2 matrix using the tensor function provided by PyTorch. We use the jacobian function from torch.autograd.functional to calculate the Jacobian matrix. The function lambda val: val.sum() takes the matrix as input and computes the sum of its elements.
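Since val.sum() returns a scalar, every entry of the Jacobian is the derivative of the sum with respect to one matrix element, which is always 1. The following snippet is only an illustrative sanity check of that fact, not part of the original example.

import torch

mat = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), mat)
# d(sum)/d(mat[i][j]) = 1 for every element, so the Jacobian is a matrix of ones.
print(torch.equal(jacobian, torch.ones_like(mat)))   # True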

Example 2: Calculating the Jacobian of a Matrix Multiplication Function

import torch

def mat_mul(A, B):
    return torch.mm(A, B)

mat1 = torch.tensor([[2.0, 3.0], [4.0, 5.0]])
mat2 = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
jacobian = torch.autograd.functional.jacobian(lambda x: mat_mul(mat1, x), mat2)
print("Jacobian Matrix:")
print(jacobian)

Output

Jacobian Matrix:
tensor([[[[2., 0.],
          [3., 0.]],

         [[0., 2.],
          [0., 3.]]],

        [[[4., 0.],
          [5., 0.]],

         [[0., 4.],
          [0., 5.]]]])

Explanation

In the above program we create two matrices, mat1 and mat2, using the tensor function, and define mat_mul, which multiplies them using the torch.mm function provided by PyTorch. We use the jacobian function to calculate the Jacobian of lambda x: mat_mul(mat1, x) with respect to mat2.
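The result is a 4-dimensional tensor because both the output and the input of the function are 2×2 matrices: entry [i][j][k][l] is the derivative of output element (i, j) with respect to input element (k, l). For y = mat1 @ x this derivative is mat1[i][k] when j == l and 0 otherwise. The snippet below is an illustrative check of that formula, assuming the same matrices as above.

import torch

mat1 = torch.tensor([[2.0, 3.0], [4.0, 5.0]])
mat2 = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
jacobian = torch.autograd.functional.jacobian(lambda x: torch.mm(mat1, x), mat2)
# Analytic Jacobian of y = mat1 @ x: dy[i][j] / dx[k][l] = mat1[i][k] if j == l else 0.
expected = torch.einsum('ik,jl->ijkl', mat1, torch.eye(2))
print(torch.allclose(jacobian, expected))   # True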

Example 3: Calculating the Jacobian of a Random Matrix

import torch

rand_matrix = torch.randn((3, 2))
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), rand_matrix)
print("Jacobian Matrix:")
print(jacobian)

Output

Jacobian Matrix: tensor([[1., 1.], [1., 1.], [1., 1.]])

Explanation

In the above program we create a 3×2 random matrix using the torch.randn function, which draws values from the standard normal distribution. We use the jacobian function from torch.autograd.functional to calculate the Jacobian of the element-sum function.
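In general the returned Jacobian has the shape of the output followed by the shape of the input; for a scalar output such as the sum, that reduces to just the input shape. A short illustrative check, assuming the same random matrix as above:

import torch

rand_matrix = torch.randn((3, 2))
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), rand_matrix)
# Scalar output, so the Jacobian has exactly the shape of the input tensor,
# and every entry is 1 regardless of the random input values.
print(jacobian.shape)   # torch.Size([3, 2])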

Example 4: Jacobian of a Matrix Squaring Function

import torch

def mat_sq(M):
    return torch.mm(M, M)

mat = torch.tensor([[2.0, 3.0], [4.0, 5.0]])
jacobian = torch.autograd.functional.jacobian(lambda ele: mat_sq(ele), mat)
print("Jacobian Matrix:")
print(jacobian)

Output

Jacobian Matrix: 
tensor([[[[ 4., 4.], 
          [ 3., 0.]], 

         [[ 3., 7.], 
          [ 0., 3.]]], 

        [[[ 4., 0.], 
          [ 7., 4.]], 

         [[ 0., 4.], 
          [ 3., 10.]]]])

Explanation

In the above program we created a function named mat_sq, which squares a matrix using the torch.mm function provided by PyTorch. We create a 2×2 matrix using the tensor function. We use the jacobian function to calculate the Jacobian matrix of lambda ele: mat_sq(ele).
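For Y = M @ M the analytic derivative is dY[i][j]/dM[a][b] = δ(i, a)·M[b][j] + M[i][a]·δ(j, b), which is exactly what the 4-dimensional output above encodes. The snippet below is an illustrative verification of that formula against the autograd result, assuming the same 2×2 matrix as above.

import torch

mat = torch.tensor([[2.0, 3.0], [4.0, 5.0]])
jacobian = torch.autograd.functional.jacobian(lambda M: torch.mm(M, M), mat)
eye = torch.eye(2)
# d(M @ M)[i][j] / dM[a][b] = eye[i][a] * M[b][j] + M[i][a] * eye[j][b]
expected = torch.einsum('ia,bj->ijab', eye, mat) + torch.einsum('ia,jb->ijab', mat, eye)
print(torch.allclose(jacobian, expected))   # True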

Example 5: Calculating the Jacobian of a 2D NumPy Array

import torch
import numpy as np

array = np.array([[1.0, 2.0, 3.0],[4.0, 5.0, 6.0]])
mat = torch.tensor(array)
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), mat)
print("Jacobian Matrix:")
print(jacobian)

Output

Jacobian Matrix: tensor([[1., 1., 1.], [1., 1., 1.]], dtype=torch.float64)

Explanation

In the above program we converted a 2D NumPy array into a PyTorch tensor using torch.tensor. We use the jacobian function from torch.autograd.functional to calculate the Jacobian matrix, passing the sum function and the matrix as parameters.
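Because NumPy arrays default to 64-bit floats, the Jacobian is reported with dtype=torch.float64. If you prefer PyTorch's default 32-bit floats, you can convert the tensor first; the snippet below is only an illustrative variation of the example above.

import torch
import numpy as np

array = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
# Converting to float32 keeps the Jacobian in PyTorch's default dtype.
mat = torch.tensor(array, dtype=torch.float32)
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), mat)
print(jacobian.dtype)   # torch.float32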

So, we saw what the Jacobian matrix is in PyTorch and how we can calculate its value in different cases. We calculated the Jacobian in various programs by creating matrices with PyTorch. We can change the values and functions and use the Jacobian matrix for optimization in machine learning tasks.
