- Advanced Optimization Algorithms Research
- Iterative Methods for Nonlinear Equations
- Optimization and Variational Analysis
- Matrix Theory and Algorithms
- Fractional Differential Equations Solutions
- Sparse and Compressive Sensing Techniques
- Advanced Numerical Methods in Computational Mathematics
- Optimization and Mathematical Programming
- Transportation Planning and Optimization
- Advanced Control Systems Optimization
- Advanced Mathematical Theories
- Advanced Vision and Imaging
- Polynomial and algebraic computation
- Electromagnetic Simulation and Numerical Methods
- Advanced Measurement and Metrology Techniques
- Evaluation and Optimization Models
- Robotics and Sensor-Based Localization
- Stochastic processes and financial applications
- Numerical methods for differential equations
- Web Applications and Data Management
- Numerical methods in inverse problems
- Fluid Dynamics and Turbulent Flows
Huaihua University
2007-2024
Hunan University
2023-2024
Nanjing University
2019-2021
China University of Geosciences
2021
East China University of Science and Technology
2015
Southeast University
2012
Xiangtan University
2007
This paper proposes a hybrid LQP-based method (LQP: logarithmic-quadratic proximal) to solve a class of structured variational inequalities. In this method, an intermediate point is produced by solving a system of nonlinear equations derived from the LQP method; a descent direction is then constructed from the current iterate and this intermediate point, and the new iterate is obtained as a convex combination of the previous iterate and the point generated by a projection-type step along the descent direction. Global convergence is proved under mild assumptions. Preliminary numerical results for traffic equilibrium...
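For context, the basic projection step that such hybrid schemes refine can be sketched as follows. This is the classical fixed-point projection method for variational inequalities, not the paper's LQP-based algorithm; the function name, the step size `beta`, and the affine test operator are illustrative assumptions.

```python
import numpy as np

def projection_method_vi(F, proj, u0, beta=0.25, iters=1000):
    """Classical projection method for VI(Omega, F): find u* in Omega with
    (u - u*)^T F(u*) >= 0 for all u in Omega. It iterates the fixed-point
    map u -> P_Omega(u - beta * F(u)), which LQP-type hybrid schemes refine.
    The step size beta is an illustrative choice (small enough to contract
    for a strongly monotone, Lipschitz F)."""
    u = np.array(u0, dtype=float)
    for _ in range(iters):
        u = proj(u - beta * F(u))  # project the forward step back onto Omega
    return u
```

For example, with F(u) = 2u - c on the nonnegative orthant, the fixed point satisfies the usual complementarity conditions u >= 0, F(u) >= 0, u^T F(u) = 0.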
In this paper, we propose a new nonlinear conjugate gradient method that generates search directions close to those of the memoryless BFGS quasi-Newton method. With an exact line search, our method reduces to the standard Hestenes–Stiefel method. Moreover, for any line search and some constant c > 0, the search direction satisfies the sufficient descent condition. We establish global convergence for strongly convex objective functions with the Wolfe line search, and modify the scheme slightly to guarantee convergence for general nonconvex problems. Numerical results show that the proposed method is efficient for unconstrained...
An algorithm for solving nonlinear monotone equations is proposed, which combines a modified Liu–Storey conjugate gradient method with the hyperplane projection method. Under mild conditions and a suitable line search, the global convergence of the proposed algorithm is established. Owing to its low storage requirement, the algorithm can be applied to solve large-scale problems. Numerical results indicate that our algorithm is efficient. Keywords: global convergence; monotone equations; projection method. 2000 Mathematics Subject Classification:...
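A minimal sketch of the hyperplane-projection framework for monotone equations (in the style of Solodov and Svaiter) is given below. The steepest-descent-like direction, the backtracking parameters, and the affine test operator are illustrative assumptions; the paper instead plugs a modified Liu–Storey CG direction into this framework.

```python
import numpy as np

def hyperplane_projection_solve(F, x0, sigma=1e-4, beta=0.5, tol=1e-9, max_iter=5000):
    """Hyperplane-projection method for monotone equations F(x) = 0.
    Sketch only: the paper replaces the steepest-descent-like direction
    below with a modified Liu-Storey CG direction."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx  # illustrative direction; derivative-free and low-storage
        # Backtracking: find t with -F(x + t*d)^T d >= sigma * t * ||d||^2.
        t = 1.0
        while -(F(x + t * d) @ d) < sigma * t * (d @ d):
            t *= beta
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z
        # Project x onto the hyperplane {y : Fz^T (y - z) = 0}, which
        # separates x from the solution set when F is monotone.
        x = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
    return x
```

For a monotone affine operator F(x) = Ax - b with A positive definite, the iterates converge to the unique root, and each step needs only vector operations, which is what makes the approach attractive for large-scale problems.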
An extended Polak–Ribière–Polyak conjugate gradient method for solving nonlinear systems of equations is proposed, which is suitable for large-scale problems owing to its low storage requirement. Under some mild conditions, the global convergence of the proposed method is established. Numerical results show that the method is practically effective.
In this paper, a descent Liu–Storey conjugate gradient method is extended to solve large-scale nonlinear systems of equations. Under certain assumptions, the global convergence property is obtained with a nonmonotone line search. The proposed method is suitable for large-scale problems because of its low storage requirement. Numerical experiments show that the new method is practically effective.
In this paper, we focus on the primal-dual hybrid gradient (PDHG) method, which is widely used to solve a broad spectrum of saddle-point problems. Despite its wide application in different areas, the study of inexact versions of PDHG still seems to be in its infancy. We investigate how to design implementable inexactness criteria for solving the subproblems in the PDHG scheme so that the convergence of an inexact PDHG method can be guaranteed. We propose two specific inexactness criteria and, accordingly, some inexact PDHG methods. The convergence of both is rigorously proved, and their convergence rates are estimated under...
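As background, the exact PDHG iteration for min_x f(x) + g(Kx) can be sketched as below. The step sizes, the denoising test problem, and the closed-form proximal maps are illustrative assumptions; the paper concerns what happens when these two subproblems are solved only approximately.

```python
import numpy as np

def pdhg(K, prox_f, prox_gconj, x0, y0, tau, sigma, iters=5000):
    """Exact PDHG for min_x f(x) + g(Kx), written as the saddle-point
    problem min_x max_y f(x) + <Kx, y> - g*(y).
    prox_f(v, tau) and prox_gconj(v, sigma) evaluate the proximal maps of
    f and of the conjugate g*; convergence needs tau * sigma * ||K||^2 < 1.
    Both proximal subproblems are solved exactly here; inexact variants
    relax precisely these two steps."""
    x, y = np.array(x0, dtype=float), np.array(y0, dtype=float)
    for _ in range(iters):
        x_new = prox_f(x - tau * (K.T @ y), tau)        # primal (proximal) step
        x_bar = 2.0 * x_new - x                         # extrapolation
        y = prox_gconj(y + sigma * (K @ x_bar), sigma)  # dual (proximal) step
        x = x_new
    return x, y
```

For instance, with f(x) = (1/2)||x - b||^2, g = ||.||_1 and K = I, we have prox_f(v, tau) = (v + tau*b)/(1 + tau) and prox_g*(v, sigma) = clip(v, -1, 1), and the iteration recovers soft-thresholding of b.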
In this paper, a modified Polak–Ribière–Polyak (MPRP) conjugate gradient method for smooth unconstrained optimization is proposed. The method produces a descent direction at each iteration, and this property is independent of the line search adopted. Under standard assumptions, we prove that the MPRP method with the strong Wolfe conditions is globally convergent. Computational results are reported to show the effectiveness of the proposed method.
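The line-search-independent descent property can be illustrated with the three-term PRP direction in the Zhang–Zhou–Li form; whether this matches the paper's exact formulas is an assumption, but the identity g^T d = -||g||^2 it yields is exactly the kind of property claimed above.

```python
import numpy as np

def mprp_direction(g, g_prev, d_prev):
    """Three-term PRP-type direction (Zhang-Zhou-Li form), shown only to
    illustrate the descent property; the paper's exact MPRP formulas are
    assumed, not quoted."""
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom        # PRP conjugacy parameter
    theta = (g @ d_prev) / denom
    # The third term cancels the cross term, so g^T d = -||g||^2 holds
    # identically, independent of the line search.
    return -g + beta * d_prev - theta * y
```

For any gradients g, g_prev and previous direction d_prev, the returned d satisfies g @ d = -(g @ g), i.e. it is always a descent direction.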
Based on several modified Hestenes–Stiefel and Polak–Ribière–Polyak nonlinear conjugate gradient methods, a family of three-term limited-memory CG methods is developed. When the current search direction falls into the subspace spanned by the previous m directions, the algorithm branches to optimizing the objective function with the L-BFGS method. We use this strategy to avoid potential local loops and to accelerate convergence. The proposed directions satisfy the sufficient descent property. The steplength is determined by a Wolfe or Armijo line search, and we...
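When such a scheme branches to L-BFGS, the inverse-Hessian-times-gradient product is typically computed with the standard two-loop recursion. The sketch below is that textbook recursion, an assumed building block rather than the authors' full hybrid algorithm.

```python
import numpy as np

def lbfgs_two_loop(g, s_list, y_list):
    """Textbook L-BFGS two-loop recursion: returns H_k @ g, where H_k is
    the limited-memory inverse-Hessian approximation built from the m
    correction pairs (s_i, y_i), ordered oldest to newest. Each pair must
    satisfy the curvature condition s_i^T y_i > 0."""
    q = np.array(g, dtype=float)
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):  # newest first
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    s_last, y_last = s_list[-1], y_list[-1]
    r = (s_last @ y_last) / (y_last @ y_last) * q  # scaling H_0 = gamma * I
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r = r + (a - b) * s
    return r  # approximately H_k g; the search direction is -r
```

A quick sanity check: with a single correction pair (s, y), the BFGS update satisfies the secant condition H y = s, so feeding g = y must return s.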
In this paper, a modified Liu–Storey conjugate gradient method is proposed. The method can generate sufficient descent directions for non-linear unconstrained optimization problems. A global convergence result is established when the line search fulfils the strong Wolfe conditions. Moreover, the R-linear convergence rate of the method is proved. Extensive numerical results show that the proposed method is efficient for problems in the CUTEr library.
This work presents a filled function method based on the filter technique for global optimization. The filled function method is one of the effective methods for nonlinear global optimization, since it can effectively escape a local minimizer and find a better one. The filter technique has been applied to local optimization because of its excellent numerical performance. To improve the filled function method, the filter technique is employed in its local optimization phases. A new filled function is proposed first, and then the algorithm and its properties are established. Numerical results are listed at the end.
Recently, a worst-case O(…)...
In this paper, we develop some three-term nonlinear conjugate gradient methods based on the Hestenes–Stiefel (HS), Polak–Ribière–Polyak (PRP) and Liu–Storey (LS) methods. The proposed algorithms always generate sufficient descent directions satisfying [Formula: see text]. When the Wolfe or Armijo line search is used, we establish the global convergence of the methods in a concise way. Moreover, the linear convergence rate is discussed as well. Extensive numerical results show the efficiency of the proposed methods.
In this article, we investigate the convergence rate of the CG-DESCENT method proposed by Hager and Zhang [Hager, W. W., Zhang, H. (2005). A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization 16:170–192]. Under reasonable conditions, we show that the method with the Wolfe line search will be n-step superlinearly or even quadratically convergent if some restart technique is used. Some numerical results are also reported to verify the theoretical results.