
A memory gradient method with a closed-form stepsize formula

Haichan Lin, Yigui Ou


This paper presents a new memory gradient method for unconstrained optimization problems. A distinctive feature of this method is that it computes the stepsize from a closed-form formula, without any stopping condition, thus broadening the convergence domain. Under some assumptions, the global and local convergence properties of the proposed method are established. Numerical results are also reported, showing the efficiency of the proposed method.
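The abstract does not reproduce the paper's actual stepsize formula, so the sketch below only illustrates the general shape of a memory gradient iteration with *a* closed-form stepsize. The choice α_k = −g_kᵀd_k / (L‖d_k‖²) (valid when the gradient is L-Lipschitz), the memory weight `beta`, and the descent-safeguard reset are all illustrative assumptions, not the authors' method:

```python
import numpy as np

def memory_gradient(grad, x0, L, beta=0.2, tol=1e-8, max_iter=5000):
    """Illustrative memory gradient iteration with a closed-form stepsize.

    grad : callable returning the gradient of the objective at x
    L    : assumed Lipschitz constant of grad (needed by this stepsize choice)
    beta : weight on the previous direction (the "memory" term)
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first iteration: steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0:           # safeguard: reset if d is not a descent direction
            d = -g
        # closed-form stepsize -- no line search or stopping condition required
        alpha = -(g @ d) / (L * (d @ d))
        x = x + alpha * d
        g = grad(x)
        d = -g + beta * d        # memory term: blend in the previous direction
    return x
```

On a strictly convex quadratic f(x) = ½xᵀAx − bᵀx (gradient Ax − b, L = largest eigenvalue of A), this iteration drives the gradient norm below the tolerance without ever performing a line search, which is the practical appeal of a closed-form stepsize.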


Keywords: unconstrained optimization, memory gradient method, closed-form stepsize formula, convergence analysis.



Disclaimer regarding indexing:

We have provided the indexing agencies (listed on the journal web site) with online access to all issues and papers. Whether, when, and in what manner they index the content is entirely up to the agencies. Past indexing is therefore no guarantee of current or future indexing policy: these third parties may discontinue any journal at any time without prior notice to the journal. Please do not send us questions, or expect answers from us, on behalf of the indexing agencies, and note that we will not issue any certificate or letter concerning indexing. Our role is only to provide them with online access, which we do; authentic information is available on the agencies' own websites. Also: DOI is a paid service provided by a third party. We have never claimed to use it for any of our journals; however, the journal has no objection if authors obtain this paid DOI service directly.