Hybrid Optimization Method Based on Coupling Local Gradient Information and Global Evolution Mechanism
KEYWORDS
evolutionary algorithm
gradient optimization
multi-objective optimization
aerodynamic optimization
SUBJECTS
Mathematics (QA1-939); engineering and technology; mechanical engineering
DOI: 10.3390/math12081234
Publication Date: 2024-04-19
AUTHORS (4)
ABSTRACT
Multi-objective evolutionary algorithms (MOEAs) have attracted much attention because of their strong global exploration ability; however, their local search ability near the optimum is weak, and for large-scale decision-variable optimization problems the population sizes and iteration counts they require are very large, so their optimization efficiency is low. Gradient-based optimization algorithms overcome these difficulties well, but gradient search methods are difficult to apply directly to multi-objective optimization problems (MOPs). To this end, this paper introduces a stochastic weighting function based on the weighted-average gradient and proposes two multi-objective stochastic gradient operators. Building on these, two efficient evolutionary algorithms, MOGBA and HMOEA, are developed. By applying different offspring-update strategies to different subpopulations, their local search capability is greatly enhanced while their good global exploration capability is retained. Numerical experiments show that HMOEA captures Pareto fronts of various shapes well and easily handles multi-objective optimization problems with many objectives, improving efficiency by a factor of 5–10 compared with typical multi-objective evolutionary algorithms. HMOEA is further applied to the multi-objective aerodynamic optimization design of the RAE2822 airfoil, where the ideal Pareto front is obtained, indicating that HMOEA is an efficient optimization algorithm with potential applications in aerodynamic design.
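As a rough illustration of the gradient component described in the abstract, the sketch below shows one way a randomly weighted average of per-objective gradients can drive a local-search step toward different Pareto-optimal trade-offs. It is a minimal sketch under assumed details: the function names (finite_diff_grad, stochastic_weighted_gradient_step), the finite-difference gradient estimate, and the toy bi-objective problem are illustrative choices, not the MOGBA/HMOEA operators defined in the paper.

```python
# Hypothetical sketch: a stochastic weighted-gradient local-search step for a
# two-objective problem. It illustrates the general idea of combining
# per-objective gradients with random convex weights; it is NOT the paper's
# exact operator.
import numpy as np

def finite_diff_grad(f, x, eps=1e-6):
    """Central-difference gradient estimate of a scalar objective f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

def stochastic_weighted_gradient_step(objectives, x, step=0.05, rng=None):
    """Move x along a randomly weighted average of the per-objective descent
    directions; different random weights favor different trade-offs."""
    rng = np.random.default_rng() if rng is None else rng
    grads = np.array([finite_diff_grad(f, x) for f in objectives])
    w = rng.random(len(objectives))
    w /= w.sum()                       # random convex combination of objectives
    d = -(w[:, None] * grads).sum(0)   # weighted-average descent direction
    n = np.linalg.norm(d)
    return x + step * d / n if n > 0 else x

# Toy bi-objective example: the Pareto set lies between the two minimizers
# (0 and 1 in each coordinate).
f1 = lambda x: np.sum(x**2)
f2 = lambda x: np.sum((x - 1.0)**2)
x = np.random.default_rng(0).uniform(-1, 2, size=5)
for _ in range(200):
    x = stochastic_weighted_gradient_step([f1, f2], x)
print(np.round(x, 3))  # components should end up roughly in [0, 1]
```

Because the convex weights are resampled at every step, repeated runs from different starting points scatter solutions along the Pareto set rather than collapsing onto a single weighted optimum, which is the intuition behind using a stochastic weighting function rather than a fixed scalarization.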