Improved Evolutionary Operators for Sparse Large-Scale Multiobjective Optimization Problems

IEEE Transactions on Evolutionary Computation (2023)

Abstract
Many critical scientific, societal, and engineering fields involve large-scale multiobjective optimization problems (LSMOPs), which comprise many decision variables. As the number of decision variables increases, however, optimization algorithms face exponentially larger search spaces and thus exhibit degraded performance. Nonetheless, LSMOPs whose optimal solutions correspond to sparse variable vectors can be solved more efficiently by evolutionary multiobjective optimization (EMO) algorithms. Despite recent strides in developing generic EMO algorithms for sparse LSMOPs, there is still room for improvement. Specifically, algorithms still struggle to find convergent and diverse Pareto fronts in an acceptable amount of time when solving sparse LSMOPs with thousands of decision variables. To better solve sparse LSMOPs, we propose a novel set of evolutionary operators that adapt small-scale EMO algorithms to sparse LSMOPs. These simple, novel, and effective operators include varied striped sparse population sampling (VSSPS), sparse simulated binary crossover (S-SBX), and sparse polynomial mutation (S-PM). Combined with NSGA-II, these operators form the proposed S-NSGA-II algorithm. S-NSGA-II runs near-universally faster than existing methods on problems containing up to 6,400 decision variables, while matching or exceeding contemporary sparse LSMOP algorithms in hypervolume, especially on problems with more than 5,000 decision variables.
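The abstract names sparse variants of classical evolutionary operators. As background, the following is a minimal sketch of standard simulated binary crossover (SBX) with a hypothetical sparsity-preserving guard: variables that are zero in both parents are left at zero, so offspring inherit the shared sparsity pattern. The guard is an illustrative assumption, not the paper's exact S-SBX definition.

```python
import random

def sbx_pair(x1, x2, eta=15.0):
    """Standard SBX on a single variable pair (Deb's spread-factor form)."""
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    # SBX is mean-preserving: c1 + c2 == x1 + x2
    c1 = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
    c2 = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return c1, c2

def sparse_sbx(p1, p2, eta=15.0):
    """Illustrative sparsity-aware SBX (assumed behavior, not the paper's S-SBX):
    skip crossover on variables that are zero in both parents, so the shared
    zero pattern is never filled in by the arithmetic blend."""
    c1, c2 = list(p1), list(p2)
    for i in range(len(p1)):
        if p1[i] == 0.0 and p2[i] == 0.0:
            continue  # preserve shared zeros
        c1[i], c2[i] = sbx_pair(p1[i], p2[i], eta)
    return c1, c2
```

The point of the guard is cost as much as sparsity: with thousands of decision variables and sparse Pareto-optimal solutions, most variable pairs are (0, 0), so skipping them keeps offspring sparse and avoids wasted arithmetic.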
Keywords
Large-scale multiobjective optimization, Evolutionary optimization, Sparse optimization, Population sampling, Crossover, Mutation