NECULAI ANDREI Home Page.

 

ISI Thomson Papers (peer-reviewed)

·        N. Andrei, A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization.
Numerical Algorithms (accepted August 17, 2021).

·        N. Andrei, A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization.
Optimization (January 16, 2020). https://doi.org/10.1080/02331934.2020.1712391

·        N. Andrei, New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method.
CALCOLO, vol. 57, Issue 2, June 2020, Article number 17. https://doi.org/10.1007/s10092-020-00365-7

·        N. Andrei, Diagonal Approximation of the Hessian by Finite Differences for Unconstrained Optimization.
Journal of Optimization Theory and Applications, vol. 185, Issue 3, 2020, pp. 859-879

·        N. Andrei, A diagonal quasi-Newton updating method for unconstrained optimization.
Numerical Algorithms, vol. 81, 2019, pp. 575-590.

·        N. Andrei, A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization.
Numerical Functional Analysis and Optimization, Vol. 40, 2019, pp. 1467-1488.

·        N. Andrei, A Double-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Method Based on Minimizing the Measure Function of Byrd and Nocedal for Unconstrained Optimization.  (JOTA) Journal of Optimization Theory and Applications, 178(1) (2018) pp. 191-218.

·        N. Andrei, A double parameter scaled modified Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization.
Studies in Informatics and Control, Vol. 27, Issue 2, 2018, pp. 135-146

·        N. Andrei, A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization.
Optimization. A Journal of Mathematical Programming and Operations Research, vol.67, issue 9, 2018, pp. 1553-1568

·        N. Andrei, An adaptive scaled BFGS method for unconstrained optimization.
Numerical Algorithms. 77 (2) (2018) pp. 413-432. DOI: 10.1007/s11075-017-0321-1

·        N. Andrei, A double parameter scaled BFGS method for unconstrained optimization.
Journal of Computational and Applied Mathematics. 332 (2018) pp. 26-44.

·        N. Andrei, A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues.
Numerical Algorithms 77 (4) (2018) pp. 1273-1282. DOI 10.1007/s11075-017-0362-5

·        N. Andrei, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update.
Journal of Computational and Applied Mathematics, 325 (2017) pp. 149-164.

·        N. Andrei, Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization.
Optimization Methods and Software, vol. 32, No. 3, 2017, pp. 534-551.
DOI: 10.1080/10556788.2016.1225211.
[This paper is dedicated to Prof. Helmut Schwarz, Duisburg-Essen University, Germany, on the occasion of his 85th birthday.]

·        N. Andrei, An adaptive conjugate gradient algorithm for large-scale unconstrained optimization.
Journal of Computational and Applied Mathematics, vol. 292, 2016, pp. 83-91.

·        N. Andrei, A new three-term conjugate gradient algorithm for unconstrained optimization.
Numerical Algorithms, vol. 68, issue 2, 2015, pp. 305-321.

·        N. Andrei, A numerical study on efficiency and robustness of some conjugate gradient algorithms for large-scale unconstrained optimization.
Studies in Informatics and Control, vol. 22, Issue 4, December 2013, pp. 259-284.

·        N. Andrei, An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization.
Numerical Algorithms. vol. 65, issue 4, 2014, pp. 859-874.

·        N. Andrei, Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization.
Journal of Optimization Theory and Applications, vol. 159 (2013), pp. 159-182. DOI 10.1007/s10957-013-0285-9

·        N. Andrei, On three-term conjugate gradient algorithms for unconstrained optimization.
Applied Mathematics and Computation, vol. 219, Issue 11, 1 February 2013, pp. 6316-6327.

·        N. Andrei, A simple three-term conjugate gradient algorithm for unconstrained optimization.
Journal of Computational and Applied Mathematics. vol. 241 (2013), pp. 19-29.
[This paper is dedicated to the memory of Neculai Gheorghiu (1930–2009), my professor of Mathematical Analysis, Faculty of Mathematics, "Alexandru Ioan Cuza" University, Iasi, Romania.]

·        N. Andrei, New accelerated conjugate gradient algorithms as a modification of Dai-Yuan's computational scheme for unconstrained optimization.
Journal of Computational and Applied Mathematics, 234 (2010), pp. 3397-3410.

·        N. Andrei, An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization.
Optimization Methods and Software, [Special issue in honour of Professor Florian A. Potra's 60th birthday]
Vol. 27, issue 4-5, (2012), pp. 583-604.

·        N. Andrei, A modified Polak-Ribiere-Polyak conjugate gradient algorithm for unconstrained optimization.
Optimization. A Journal of Mathematical Programming and Operations Research, Vol. 60, Issue 12 (2011), pp. 1457-1471.

·        N. Andrei, Open problems in conjugate gradient algorithms for unconstrained optimization.
Bulletin of the Malaysian Mathematical Sciences Society, (2) 34 (2) (2011), pp. 319-330.

·        N. Andrei, Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization.
European Journal of Operational Research, Vol. 204, Issue 3 (2010), pp. 410-420.

·        N. Andrei, Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization.
Numerical Algorithms, 54 (2010), pp. 23-46.

·        N. Andrei, Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization.
Journal of Computational and Applied Mathematics, Vol. 230, Issue 2 (2009), pp. 570-582.

·        N. Andrei, Acceleration of conjugate gradient algorithms for unconstrained optimization.
Applied Mathematics and Computation, Volume 213, Issue 2, 15 July 2009, pp. 361-369.

·        N. Andrei, Another nonlinear conjugate gradient algorithm for unconstrained optimization.
Optimization Methods and Software (OMS). Vol. 24, Issue. 1, February 2009, pp. 89-104.

·        N. Andrei, Hybrid conjugate gradient algorithm for unconstrained optimization.
Journal of Optimization Theory and Applications, Vol. 141, No. 2, May 2009, pp. 249-264.

·        N. Andrei, A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy condition for unconstrained optimization.
Applied Mathematics Letters (AML), Volume 21, Issue 2, February 2008, pp. 165-171.

·        N. Andrei, Another hybrid conjugate gradient algorithm for unconstrained optimization.
Numerical Algorithms, Vol. 47 (2008), pp. 143-156.

·        N. Andrei, A scaled nonlinear conjugate gradient algorithm for unconstrained optimization.
Optimization. A Journal of Mathematical Programming and Operations Research, Vol. 57, Issue 4, August 2008, pp. 549-570.

·        N. Andrei, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization.
Optimization Methods and Software (OMS), Volume 22, Number 4, August 2007, pp. 561-571.

·        N. Andrei, A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization.
Applied Mathematics Letters (AML), Vol. 20, Issue 6 (2007), pp. 645-650.

·        N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization.
Computational Optimization and Applications, vol. 38, no. 3, September 2007, pp. 401-416.

·        N. Andrei, An acceleration of gradient descent algorithm with backtracking for unconstrained optimization.
Numerical Algorithms, volume 42, number 1, May 2006, pp. 63-73.

Neculai Andrei, August 2021