Min Li

ORCID: 0000-0003-4695-7660
Research Areas
  • Advanced Optimization Algorithms Research
  • Iterative Methods for Nonlinear Equations
  • Optimization and Variational Analysis
  • Matrix Theory and Algorithms
  • Fractional Differential Equations Solutions
  • Sparse and Compressive Sensing Techniques
  • Advanced Numerical Methods in Computational Mathematics
  • Optimization and Mathematical Programming
  • Transportation Planning and Optimization
  • Advanced Control Systems Optimization
  • Advanced Mathematical Theories
  • Advanced Vision and Imaging
  • Polynomial and algebraic computation
  • Electromagnetic Simulation and Numerical Methods
  • Advanced Measurement and Metrology Techniques
  • Evaluation and Optimization Models
  • Robotics and Sensor-Based Localization
  • Stochastic processes and financial applications
  • Numerical methods for differential equations
  • Web Applications and Data Management
  • Numerical methods in inverse problems
  • Fluid Dynamics and Turbulent Flows

Huaihua University
2007-2024

Hunan University
2023-2024

Nanjing University
2019-2021

China University of Geosciences
2021

East China University of Science and Technology
2015

Southeast University
2012

Xiangtan University
2007

This paper proposes a hybrid LQP-based method (LQP: logarithmic-quadratic proximal) to solve a class of structured variational inequalities. In this method, an intermediate point is produced by solving a system of nonlinear equations based on the LQP method; a descent direction is constructed using this iterate, and the new iterate is obtained as a convex combination of the previous one and a point generated by a projection-type step along this direction. Global convergence is proved under mild assumptions. Preliminary numerical results for traffic equilibrium...

10.1080/00207160.2012.688822 article EN International Journal of Computer Mathematics 2012-05-22

In this paper, we propose a new nonlinear conjugate gradient method, which generates search directions close to that of the memoryless BFGS quasi-Newton method. With an exact line search, our method reduces to the standard Hestenes–Stiefel method. Moreover, for any line search, the generated direction satisfies a sufficient descent condition. We establish global convergence for strongly convex objective functions with the Wolfe line search, and we modify the scheme slightly to guarantee convergence for general nonconvex problems. Numerical results show that the proposed method is efficient for unconstrained...

10.1080/10556788.2017.1325885 article EN Optimization methods & software 2017-05-19
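The "memoryless BFGS" direction that the method above approximates can be computed with a handful of inner products and no stored matrix. A minimal sketch, assuming the classical Shanno memoryless BFGS form (the paper's own formula may differ in its modification):

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """d = -H g, where H is one BFGS update of the identity built from the
    last step s = x_k - x_{k-1} and gradient change y = g_k - g_{k-1}.
    Requires curvature y @ s > 0, which a Wolfe line search guarantees."""
    tau = y @ s
    sg, yg, yy = s @ g, y @ g, y @ y
    return -g + (sg / tau) * y + (yg / tau - yy * sg / tau ** 2 - sg / tau) * s
```

Because only inner products of n-vectors appear, nothing of size n×n is ever formed, which is what makes such directions attractive for large-scale problems.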

Abstract An algorithm for solving nonlinear monotone equations is proposed, which combines a modified Liu–Storey conjugate gradient method with the hyperplane projection method. Under mild conditions, the global convergence of the proposed method is established with a suitable line search. The method can be applied to solve large-scale problems owing to its lower storage requirement. Numerical results indicate that our method is efficient. Keywords: Global convergence; Monotone equations; Projection method. 2000 Mathematics Subject Classification:...

10.1080/01630563.2013.812656 article EN Numerical Functional Analysis and Optimization 2013-06-21
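The hyperplane projection step referred to above (in the style of Solodov and Svaiter) can be sketched in a few lines: a line search finds a trial point z whose residual F(z) defines a hyperplane separating the iterate from the solution set, and the iterate is projected onto that hyperplane. For clarity this sketch uses the plain direction d = -F(x) rather than the paper's modified Liu–Storey direction, so the direction choice here is an assumption:

```python
import numpy as np

def hyperplane_projection_solve(F, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=1000):
    """Hyperplane projection method for monotone F(x) = 0.
    1) backtrack along d until z = x + a*d satisfies -F(z) @ d >= sigma*a*||d||^2;
    2) project x onto the separating hyperplane {v : F(z) @ (v - z) = 0},
       which is known to shrink the distance to any solution."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                          # simplest descent-type direction
        a = 1.0
        while -(F(x + a * d) @ d) < sigma * a * (d @ d):
            a *= beta
            if a < 1e-12:
                break
        z = x + a * d
        Fz = F(z)
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x
```

For example, F(x) = 2x + sin(x) is monotone (its Jacobian is diagonal with entries 2 + cos(x_i) >= 1), and the iteration converges to its unique zero at the origin.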

An extended Polak–Ribière–Polyak conjugate gradient method for solving nonlinear systems of equations is proposed, which is suitable for large-scale problems owing to its lower storage requirement. Under some mild conditions, the global convergence of the proposed method is established. The numerical results show that it is practically effective.

10.1080/10556788.2013.816306 article EN Optimization methods & software 2013-06-27

10.1007/s40314-013-0064-0 article EN Computational and Applied Mathematics 2013-08-10

In this paper, a descent Liu–Storey conjugate gradient method is extended to solve large-scale nonlinear systems of equations. Under certain assumptions, the global convergence property is obtained with a nonmonotone line search. The proposed method is suitable for large-scale problems owing to its low storage requirement. Numerical experiment results show that the new method is practically effective.

10.1155/2020/6854501 article EN Mathematical Problems in Engineering 2020-10-28

In this paper, we focus on the primal–dual hybrid gradient (PDHG) method, which is widely used to solve a broad spectrum of saddle-point problems. Despite its wide application in different areas, the study of inexact versions of PDHG still seems to be in its infancy. We investigate how to design implementable inexactness criteria for solving the subproblems in the PDHG scheme so that the convergence of an inexact PDHG method can be guaranteed. We propose two specific criteria and, accordingly, some inexact PDHG methods. The convergence of both methods is rigorously proved, and their convergence rates are estimated under...

10.1142/s0217595921500445 article EN Asia Pacific Journal of Operational Research 2021-09-19
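For reference, the exact PDHG iteration that the inexact variants above relax alternates a proximal dual ascent step, a proximal primal descent step, and an extrapolation. A minimal sketch on a toy 1D total-variation denoising problem, where both proximal subproblems happen to have closed forms (so no inexactness is needed; the problem, the parameter values lam, tau, sigma, and the use of the Chambolle–Pock theta = 1 form are all assumptions for illustration):

```python
import numpy as np

def pdhg_tv_denoise(b, lam=0.5, tau=0.4, sigma=0.4, iters=20000):
    """Exact PDHG for min_x 0.5*||x - b||^2 + lam*||D x||_1, with D the
    forward-difference operator, via the saddle-point form
    min_x max_{|y| <= lam} 0.5*||x - b||^2 + <D x, y>.
    Step sizes satisfy tau*sigma*||D||^2 <= 1 because ||D||^2 <= 4."""
    n = len(b)
    D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)    # (n-1) x n differences
    x = b.copy()
    x_bar = x.copy()
    y = np.zeros(n - 1)
    for _ in range(iters):
        # dual step: ascent then projection onto the l_inf ball of radius lam
        y = np.clip(y + sigma * (D @ x_bar), -lam, lam)
        # primal step: closed-form prox of 0.5*||. - b||^2
        x_old = x
        x = (x - tau * (D.T @ y) + tau * b) / (1 + tau)
        x_bar = 2 * x - x_old                       # extrapolation, theta = 1
    return x, y
```

At a saddle point the primal update's fixed-point relation gives x = b - D^T y, which provides a simple optimality check after the loop.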

10.1016/j.cam.2021.113870 article EN Journal of Computational and Applied Mathematics 2021-10-25

In this paper, a modified Polak–Ribière–Polyak (MPRP) conjugate gradient method for smooth unconstrained optimization is proposed. This method produces a descent direction at each iteration, and this property is independent of the line search adopted. Under standard assumptions, we prove that the MPRP method using the strong Wolfe conditions is globally convergent. The results of computational experiments are reported to show the effectiveness of the proposed method.

10.1080/10556788.2012.755182 article EN Optimization methods & software 2012-12-04
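The line-search-independent descent property quoted above is exactly what three-term PRP modifications deliver. As an illustration, the well-known Zhang–Zhou–Li three-term PRP direction (assumed here as a stand-in for the paper's MPRP variant, whose details may differ) satisfies g @ d = -||g||^2 identically:

```python
import numpy as np

def mprp_direction(g, g_prev, d_prev):
    """Three-term PRP direction: d = -g + beta*d_prev - theta*y, y = g - g_prev.
    Expanding g @ d shows the beta and theta terms cancel, so
    g @ d = -g @ g for ANY line search (always a descent direction)."""
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom
    theta = (g @ d_prev) / denom
    return -g + beta * d_prev - theta * y

def minimize_mprp(f, grad, x0, tol=1e-6, max_iter=1000):
    """Three-term PRP iteration with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        a, delta = 1.0, 1e-4
        while f(x + a * d) > f(x) + delta * a * (g @ d):
            a *= 0.5
            if a < 1e-12:
                break
        x_new = x + a * d
        g_new = grad(x_new)
        d = mprp_direction(g_new, g, d)
        x, g = x_new, g_new
    return x
```

Because the descent identity holds by construction, the Armijo backtracking loop always terminates, with no condition on the line search needed to guarantee descent.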

Based on several modified Hestenes–Stiefel and Polak–Ribière–Polyak nonlinear conjugate gradient methods, a family of three-term limited-memory CG methods is developed. When the current search direction falls into the subspace spanned by the previous m directions, the algorithm branches to optimize the objective function using the L-BFGS method. We use this strategy to avoid potential local loops and to accelerate convergence. The proposed directions are sufficient descent directions. The steplength is determined by a Wolfe or Armijo line search, and we...

10.1080/10556788.2024.2329591 article EN Optimization methods & software 2024-05-23

In this paper, a modified Liu–Storey conjugate gradient method is proposed. The method can generate sufficient descent directions for non-linear unconstrained optimization problems. A global convergence result is established when the line search fulfils the strong Wolfe conditions. Moreover, a linear convergence rate of the methods is proved. Extensive numerical results show that the proposed method is efficient for problems in the CUTEr library.

10.1080/02331934.2014.895903 article EN Optimization 2014-03-10

This work presents a filled function method based on the filter technique for global optimization. The filled function method is one of the effective methods for nonlinear global optimization, since it can effectively find a better minimizer. The filter technique is applied to local optimization for its excellent numerical results. In order to improve the filled function method, the filter technique is employed in the local optimizations within this method. A new filled function is proposed first, and then the algorithm and its properties are proved. The numerical results are listed at the end.

10.1155/2015/245427 article EN cc-by Journal of Applied Mathematics 2015-01-01

Recently, a worst-case ...

10.1155/2013/912846 article EN cc-by Abstract and Applied Analysis 2013-01-01

10.1590/s1807-03022012000100004 article EN Computational and Applied Mathematics 2012-01-01

10.1007/s11464-009-0046-0 article EN Frontiers of Mathematics in China 2009-11-12

In this paper, we develop some three-term nonlinear conjugate gradient methods based on the Hestenes–Stiefel (HS), Polak–Ribière–Polyak (PRP) and Liu–Storey (LS) methods. The proposed algorithms always generate sufficient descent directions which satisfy [Formula: see text]. When the Wolfe or Armijo line search is used, we establish the global convergence of the methods in a concise way. Moreover, the linear convergence rate is discussed as well. Extensive numerical results show the efficiency of the proposed methods.

10.1142/s0217595923500203 article EN Asia Pacific Journal of Operational Research 2023-05-17

In this article, we investigate the convergence rate of the CG-DESCENT method proposed by Hager and Zhang [W. W. Hager and H. Zhang (2005). A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization 16:170–192]. Under reasonable conditions, we show that the method with the Wolfe line search will be n-step superlinearly and even quadratically convergent if some restart technique is used. Some numerical results are also reported to verify the theoretical results.

10.1080/01630563.2012.760590 article EN Numerical Functional Analysis and Optimization 2013-05-01