
Exploiting Second Order Information in Computational Multi-objective Evolutionary Optimization

Published: 2007

Book title: Progress in Artificial Intelligence
Volume: 4874
Series: Lecture Notes in Computer Science
Pages: 271-282
Refereed publication

Evolutionary algorithms are efficient population-based algorithms for solving multi-objective optimization problems. Recently, various authors have discussed the efficacy of combining classical gradient-based methods with evolutionary algorithms, since gradient information leads to convergence to Pareto-optimal solutions at a linear convergence rate. However, none of the existing studies has explored how to exploit second-order (Hessian) information in evolutionary multi-objective algorithms. Second-order information, though costly, leads to quadratic convergence to Pareto-optimal solutions. In this paper, we adapt Levenberg-Marquardt methods from classical optimization and present two ways of constructing hybrid algorithms. These algorithms require gradient and Hessian information, which is obtained using finite difference techniques. Computational studies on a number of test problems of varying complexity demonstrate the efficiency of the resulting hybrid algorithms in solving a large class of complex multi-objective optimization problems.
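As a rough illustration of the local-search ingredient the abstract describes (not the paper's exact algorithm), the sketch below applies one Levenberg-Marquardt step to a scalarized two-objective problem, with the gradient and Hessian approximated by central finite differences. All function names, the equal-weight scalarization, and the test objectives are illustrative assumptions.

```python
# Hypothetical sketch: one Levenberg-Marquardt step on a weighted-sum
# scalarization of two objectives, using finite-difference derivatives
# as mentioned in the abstract. Restricted to 2 variables so the
# linear system (H + lam*I) d = -g can be solved by Cramer's rule
# without external libraries.

def grad_fd(f, x, h=1e-5):
    """Central-difference gradient of f at x (list of floats)."""
    g = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def hess_fd(f, x, h=1e-4):
    """Central-difference Hessian of f at x."""
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp, xpm, xmp, xmm = x[:], x[:], x[:], x[:]
            xpp[i] += h; xpp[j] += h
            xpm[i] += h; xpm[j] -= h
            xmp[i] -= h; xmp[j] += h
            xmm[i] -= h; xmm[j] -= h
            H[i][j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4 * h * h)
    return H

def lm_step(f, x, lam=1e-3):
    """Solve (H + lam*I) d = -g for a 2-variable f and return x + d.

    The damping parameter lam interpolates between a Newton step
    (lam -> 0, quadratic convergence near the optimum) and a small
    gradient-descent step (large lam, robust far from the optimum).
    """
    g = grad_fd(f, x)
    H = hess_fd(f, x)
    a, b = H[0][0] + lam, H[0][1]
    c, d = H[1][0], H[1][1] + lam
    det = a * d - b * c
    dx = (-g[0] * d + g[1] * b) / det
    dy = (-g[1] * a + g[0] * c) / det
    return [x[0] + dx, x[1] + dy]

if __name__ == "__main__":
    # Two toy convex objectives; their equal-weight scalarization has
    # its minimum at (0, 0), a Pareto-optimal point of (f1, f2).
    f1 = lambda x: (x[0] - 1) ** 2 + x[1] ** 2
    f2 = lambda x: (x[0] + 1) ** 2 + x[1] ** 2
    f = lambda x: 0.5 * f1(x) + 0.5 * f2(x)
    print(lm_step(f, [3.0, 2.0], lam=1e-6))
```

In a hybrid scheme of the kind the abstract outlines, such a step would be applied to selected population members of the evolutionary algorithm to accelerate their local convergence, at the cost of the extra function evaluations needed for the finite-difference Hessian.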


Effiziente Algorithmen