Minimize a function using a steepest gradient descent algorithm. This complements the collection of minimization routines provided in scipy.optimize. Steepest-descent iterations are cheaper than those of the conjugate gradient or Newton methods, so convergence may sometimes be faster overall, although more iterations are typically needed.
Parameters
----------
f : callable
    Objective function to be minimized.
x0 : array
    Initial guess for the minimizer.
fprime : callable
    Function returning the gradient of `f`.
xtol : float
    Tolerance on the change in `x` used to declare convergence.
ftol : float
    Tolerance on the decrease of `f` used to declare convergence.
maxiter : int
    Maximum number of iterations.
callback : callable
    Optional function called after each iteration with the current
    parameter vector.
disp : bool
    If True, print convergence messages.

Returns
-------
x : array
    Estimated location of the minimum.
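The routine's implementation is not shown here; below is a minimal NumPy sketch of steepest gradient descent using the parameter names documented above. The function name `fmin_steepest`, the backtracking line search, and the exact tolerance logic are illustrative assumptions, not the library's actual code.

    import numpy as np

    def fmin_steepest(f, x0, fprime, xtol=1e-6, ftol=1e-6,
                      maxiter=1000, callback=None, disp=False):
        """Illustrative steepest-descent sketch (not the library code)."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for it in range(maxiter):
            g = fprime(x)
            # Backtracking line search along the negative gradient:
            # halve the step until the objective decreases.
            step = 1.0
            while f(x - step * g) >= fx and step > 1e-16:
                step *= 0.5
            x_new = x - step * g
            fx_new = f(x_new)
            if callback is not None:
                callback(x_new)
            # Stop when both the step and the decrease in f are small.
            if np.linalg.norm(x_new - x) < xtol and abs(fx - fx_new) < ftol:
                x, fx = x_new, fx_new
                break
            x, fx = x_new, fx_new
        if disp:
            print(f"Stopped after {it + 1} iterations, f(x) = {fx:.6g}")
        return x

    # Usage example on a simple quadratic bowl with minimum at (1, -2).
    f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
    fprime = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
    print(fmin_steepest(f, np.zeros(2), fprime, disp=True))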