
Module « scipy.special »

Function kl_div - module scipy.special

Signature of the kl_div function

Description

kl_div.__doc__

kl_div(x1, x2, /, out=None, *, where=True, casting='same_kind', order='K', dtype=None, subok=True[, signature, extobj])

kl_div(x, y, out=None)

Elementwise function for computing Kullback-Leibler divergence.

.. math::

    \mathrm{kl\_div}(x, y) =
      \begin{cases}
        x \log(x / y) - x + y & x > 0, y > 0 \\
        y & x = 0, y \ge 0 \\
        \infty & \text{otherwise}
      \end{cases}
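The piecewise definition above can be checked numerically (a quick sketch, assuming NumPy and SciPy are installed; the input values are chosen only for illustration):

```python
import numpy as np
from scipy.special import kl_div

x = np.array([1.0, 0.5, 0.0])
y = np.array([0.5, 0.5, 2.0])

res = kl_div(x, y)  # evaluated elementwise

# Case x > 0, y > 0: x*log(x/y) - x + y
print(res[0])  # log(2) - 0.5, approximately 0.1931
print(res[1])  # 0.0, since x == y
# Case x == 0, y >= 0: the result is y
print(res[2])  # 2.0
```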

Parameters
----------
x, y : array_like
    Real arguments
out : ndarray, optional
    Optional output array for the function results

Returns
-------
scalar or ndarray
    Values of the Kullback-Leibler divergence.
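As with other ufuncs in `scipy.special`, the results can be written into a preallocated array through the `out` parameter (a minimal sketch, assuming NumPy and SciPy are installed):

```python
import numpy as np
from scipy.special import kl_div

x = np.array([0.1, 0.4])
y = np.array([0.2, 0.2])

buf = np.empty_like(x)
kl_div(x, y, out=buf)  # results stored in buf, no new array allocated

# buf now holds x*log(x/y) - x + y elementwise
print(buf)
```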

See Also
--------
entr, rel_entr

Notes
-----
.. versionadded:: 0.15.0

This function is non-negative and is jointly convex in `x` and `y`.

The origin of this function is in convex programming; see [1]_ for
details. This is why the function contains the extra :math:`-x
+ y` terms over what might be expected from the Kullback-Leibler
divergence. For a version of the function without the extra terms,
see `rel_entr`.
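The relationship to `rel_entr` can be verified directly: `kl_div(x, y)` equals `rel_entr(x, y) - x + y` elementwise, and when `x` and `y` are probability vectors the extra terms cancel in the sum (a sketch, assuming NumPy and SciPy are installed):

```python
import numpy as np
from scipy.special import kl_div, rel_entr

# Two probability vectors (each sums to 1)
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.25, 0.25, 0.5])

# Elementwise: kl_div = rel_entr + (-x + y)
print(np.allclose(kl_div(p, q), rel_entr(p, q) - p + q))  # True

# Summed over probability vectors, the -x + y terms cancel,
# so both give the same Kullback-Leibler divergence
print(np.isclose(np.sum(kl_div(p, q)), np.sum(rel_entr(p, q))))  # True
```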

References
----------
.. [1] Grant, Boyd, and Ye, "CVX: Matlab Software for Disciplined Convex
    Programming", http://cvxr.com/cvx/