Module "scipy.optimize"

Function fmin_ncg - module scipy.optimize

Signature of the fmin_ncg function

def fmin_ncg(f, x0, fprime, fhess_p=None, fhess=None, args=(), avextol=1e-05, epsilon=np.float64(1.4901161193847656e-08), maxiter=None, full_output=0, disp=1, retall=0, callback=None, c1=0.0001, c2=0.9) 

Description

help(scipy.optimize.fmin_ncg)

Unconstrained minimization of a function using the Newton-CG method.

Parameters
----------
f : callable ``f(x, *args)``
    Objective function to be minimized.
x0 : ndarray
    Initial guess.
fprime : callable ``f'(x, *args)``
    Gradient of f.
fhess_p : callable ``fhess_p(x, p, *args)``, optional
    Function which computes the Hessian of f times an
    arbitrary vector, p.
fhess : callable ``fhess(x, *args)``, optional
    Function to compute the Hessian matrix of f.
args : tuple, optional
    Extra arguments passed to f, fprime, fhess_p, and fhess
    (the same set of extra arguments is supplied to all of
    these functions).
epsilon : float or ndarray, optional
    If fhess is approximated, use this value for the step size.
callback : callable, optional
    An optional user-supplied function which is called after
    each iteration. Called as callback(xk), where xk is the
    current parameter vector.
avextol : float, optional
    Convergence is assumed when the average relative error in
    the minimizer falls below this amount.
maxiter : int, optional
    Maximum number of iterations to perform.
full_output : bool, optional
    If True, return the optional outputs.
disp : bool, optional
    If True, print convergence message.
retall : bool, optional
    If True, return a list of results at each iteration.
c1 : float, default: 1e-4
    Parameter for Armijo condition rule.
c2 : float, default: 0.9
    Parameter for curvature condition rule.

Returns
-------
xopt : ndarray
    Parameters which minimize f, i.e., ``f(xopt) == fopt``.
fopt : float
    Value of the function at xopt, i.e., ``fopt = f(xopt)``.
fcalls : int
    Number of function calls made.
gcalls : int
    Number of gradient calls made.
hcalls : int
    Number of Hessian calls made.
warnflag : int
    Warnings generated by the algorithm.
    1 : Maximum number of iterations exceeded.
    2 : Line search failure (precision loss).
    3 : NaN result encountered.
allvecs : list
    The result at each iteration, if retall is True (see below).

See also
--------
minimize: Interface to minimization algorithms for multivariate
    functions. See the 'Newton-CG' `method` in particular.

Notes
-----
Only one of `fhess_p` or `fhess` needs to be given.  If `fhess`
is provided, then `fhess_p` will be ignored. If neither `fhess`
nor `fhess_p` is provided, then the Hessian product will be
approximated using finite differences on `fprime`. `fhess_p`
must compute the Hessian times an arbitrary vector.

Newton-CG methods are also called truncated Newton methods. This
function differs from scipy.optimize.fmin_tnc because

1. scipy.optimize.fmin_ncg is written purely in Python using NumPy
    and scipy while scipy.optimize.fmin_tnc calls a C function.
2. scipy.optimize.fmin_ncg is only for unconstrained minimization
    while scipy.optimize.fmin_tnc is for unconstrained minimization
    or box constrained minimization. (Box constraints give
    lower and upper bounds for each variable separately.)

Parameters `c1` and `c2` must satisfy ``0 < c1 < c2 < 1``.

References
----------
Wright & Nocedal, 'Numerical Optimization', 1999, p. 140.
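
Example

A minimal usage sketch (not from the original page; it assumes a standard SciPy installation). It minimizes the Rosenbrock function, whose test helpers rosen, rosen_der and rosen_hess ship with scipy.optimize, supplying the gradient through fprime and the full Hessian through fhess:

import numpy as np
from scipy.optimize import fmin_ncg, rosen, rosen_der, rosen_hess

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])    # initial guess

# Basic call: returns only xopt when full_output is left at 0
xopt = fmin_ncg(rosen, x0, fprime=rosen_der, fhess=rosen_hess, avextol=1e-8)
print(xopt)                                  # close to [1. 1. 1. 1. 1.], the known minimum

# With full_output=True (and retall left at 0), the optional outputs
# described above are returned as a tuple
xopt, fopt, fcalls, gcalls, hcalls, warnflag = fmin_ncg(
    rosen, x0, fprime=rosen_der, fhess=rosen_hess,
    full_output=True, disp=False)
print(fopt, warnflag)                        # fopt near 0; warnflag is 0 on success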



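As the notes explain, the full Hessian can be replaced by a Hessian-vector product via fhess_p. A second sketch (again an illustration, not part of the original page) uses the rosen_hess_prod helper from scipy.optimize and compares the result against the recommended minimize() front end mentioned in the "See also" section; both calls should converge to the same minimum here:

import numpy as np
from scipy.optimize import fmin_ncg, minimize, rosen, rosen_der, rosen_hess_prod

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# fhess_p receives the current point x and a vector p and must return H(x) @ p;
# rosen_hess_prod has exactly that signature
xopt = fmin_ncg(rosen, x0, fprime=rosen_der, fhess_p=rosen_hess_prod, disp=False)

# The same algorithm through minimize() with method='Newton-CG',
# where the Hessian-vector product is passed as hessp
res = minimize(rosen, x0, method='Newton-CG', jac=rosen_der, hessp=rosen_hess_prod)
print(np.allclose(xopt, res.x, atol=1e-4))   # True: both reach the same minimizer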