TY - JOUR
T1 - A Recursive Decomposition Method for Large Scale Continuous Optimization
AU - Sun, Yuan
AU - Kirley, Michael
AU - Halgamuge, Saman K.
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/10
Y1 - 2018/10
N2 - Cooperative co-evolution (CC) is an evolutionary computation framework that can be used to solve high-dimensional optimization problems via a 'divide-and-conquer' mechanism. However, the main challenge when using this framework lies in problem decomposition, that is, deciding how to allocate decision variables, especially interacting ones, to particular subproblems. Existing decomposition methods are typically computationally expensive. In this paper, we propose a new decomposition method, which we call recursive differential grouping (RDG), that considers the interaction between decision variables based on nonlinearity detection. RDG recursively examines the interaction between a selected decision variable and the remaining variables, placing all interacting decision variables into the same subproblem. We use analytical methods to show that RDG can efficiently decompose a problem without explicitly examining all pairwise variable interactions. We evaluated the efficacy of the RDG method using large scale benchmark optimization problems. Numerical simulation experiments showed that RDG greatly improved the efficiency of problem decomposition in terms of time complexity. Significantly, when RDG was embedded in a CC framework, the optimization results were better than those obtained with seven other decomposition methods.
AB - Cooperative co-evolution (CC) is an evolutionary computation framework that can be used to solve high-dimensional optimization problems via a 'divide-and-conquer' mechanism. However, the main challenge when using this framework lies in problem decomposition, that is, deciding how to allocate decision variables, especially interacting ones, to particular subproblems. Existing decomposition methods are typically computationally expensive. In this paper, we propose a new decomposition method, which we call recursive differential grouping (RDG), that considers the interaction between decision variables based on nonlinearity detection. RDG recursively examines the interaction between a selected decision variable and the remaining variables, placing all interacting decision variables into the same subproblem. We use analytical methods to show that RDG can efficiently decompose a problem without explicitly examining all pairwise variable interactions. We evaluated the efficacy of the RDG method using large scale benchmark optimization problems. Numerical simulation experiments showed that RDG greatly improved the efficiency of problem decomposition in terms of time complexity. Significantly, when RDG was embedded in a CC framework, the optimization results were better than those obtained with seven other decomposition methods.
KW - Continuous optimization problem
KW - cooperative co-evolution (CC)
KW - decomposition method
KW - large scale global optimization (LSGO)
UR - http://www.scopus.com/inward/record.url?scp=85036588611&partnerID=8YFLogxK
U2 - 10.1109/TEVC.2017.2778089
DO - 10.1109/TEVC.2017.2778089
M3 - Article
SN - 1089-778X
VL - 22
SP - 647
EP - 661
JO - IEEE Transactions on Evolutionary Computation
JF - IEEE Transactions on Evolutionary Computation
IS - 5
M1 - 8122017
ER -