scipy.special.kl_div
- scipy.special.kl_div(x, y, out=None) = <ufunc 'kl_div'>
Elementwise function for computing Kullback-Leibler divergence.
\[\begin{split}\mathrm{kl\_div}(x, y) = \begin{cases} x \log(x / y) - x + y & x > 0, y > 0 \\ y & x = 0, y \ge 0 \\ \infty & \text{otherwise} \end{cases}\end{split}\]
- Parameters:
- x, y : array_like
Real arguments.
- out : ndarray, optional
Optional output array for the function results.
- Returns:
- scalar or ndarray
Values of the Kullback-Leibler divergence.
See also
rel_entr
Elementwise function for computing relative entropy.
Notes
Added in version 0.15.0.
This function is non-negative and is jointly convex in x and y.
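As a quick numerical check of the piecewise definition and the non-negativity claim, here is a minimal sketch using NumPy arrays (printed values in the comments are approximate):

```python
import numpy as np
from scipy.special import kl_div

x = np.array([1.0, 2.0, 0.0, 0.0, 1.0])
y = np.array([1.0, 1.0, 5.0, 0.0, 0.0])

# Branches: x > 0, y > 0 -> x*log(x/y) - x + y;  x == 0, y >= 0 -> y;  otherwise -> inf
print(kl_div(x, y))               # approximately [0, 0.3863, 5, 0, inf]

# Check the first branch by hand for x = 2, y = 1: 2*log(2) - 2 + 1
print(2 * np.log(2) - 1)          # approximately 0.3863

# Non-negativity holds elementwise (inf counts as >= 0 here)
print(np.all(kl_div(x, y) >= 0))  # True
```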
The origin of this function is in convex programming; see [1] for details. This is why the function contains the extra \(-x + y\) terms over what might be expected from the Kullback-Leibler divergence. For a version of the function without the extra terms, see rel_entr.
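For positive arguments the two functions differ only by those extra terms, which can be checked directly (a small sketch, assuming both ufuncs broadcast the same way):

```python
import numpy as np
from scipy.special import kl_div, rel_entr

x = np.array([0.5, 1.0, 2.0])
y = np.array([1.0, 2.0, 0.5])

# kl_div adds the extra -x + y terms on top of rel_entr
print(np.allclose(kl_div(x, y), rel_entr(x, y) - x + y))  # True
```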
kl_div has experimental support for Python Array API Standard compatible backends in addition to NumPy. Please consider testing these features by setting the environment variable SCIPY_ARRAY_API=1 and providing CuPy, PyTorch, JAX, or Dask arrays as array arguments. The following combinations of backend and device (or other capability) are supported.

| Library | CPU | GPU |
|---------|-----|-----|
| NumPy   | ✅  | n/a |
| CuPy    | n/a | ✅  |
| PyTorch | ✅  | ⛔  |
| JAX     | ✅  | ✅  |
| Dask    | ✅  | n/a |
See Support for the array API standard for more information.
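A sketch of what testing an alternative backend might look like, assuming PyTorch is installed and SCIPY_ARRAY_API=1 is set in the environment before SciPy is imported (the script name is hypothetical, and since this support is experimental, details may vary by version):

```python
# Run as, e.g.:  SCIPY_ARRAY_API=1 python kl_div_torch_demo.py
# (the variable must be set before SciPy is imported)
import torch
from scipy.special import kl_div

x = torch.tensor([1.0, 2.0])
y = torch.tensor([1.0, 1.0])

# With experimental array API support enabled, the input backend is
# preserved, so the result here should be a torch.Tensor
print(kl_div(x, y))
```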
References
[1] Boyd, Stephen and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004. DOI: https://doi.org/10.1017/CBO9780511804441