On Handling a Large Number of Objectives A Posteriori and During Optimization

  • Dimo Brockhoff
  • Dhish Kumar Saxena
  • Kalyanmoy Deb
  • Eckart Zitzler
Part of the Natural Computing Series (NCS)

Summary

Dimensionality reduction methods are routinely used in statistics, pattern recognition, data mining, and machine learning to cope with high-dimensional spaces. In high-dimensional multiobjective optimization problems, too, reducing the objective space can benefit both search and decision making. This raises new questions, e.g., how to select a subset of objectives while preserving most of the problem structure. In this chapter, two approaches to objective reduction are developed: one based on assessing explicit conflicts between objectives, the other based on principal component analysis (PCA). Although the two methods rely on different principles and preserve different properties of the underlying optimization problem, each can be employed either a posteriori or during search. Here, we demonstrate the conflict-based approach in a decision-making scenario after the search and show how the PCA-based approach can be integrated into an evolutionary multiobjective optimization (EMO) procedure.
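To make the two ideas concrete, the following Python/NumPy sketch illustrates them on a matrix F of sampled objective vectors (rows are solutions, columns are objectives, minimization assumed). It is a minimal, hypothetical illustration, not the authors' algorithms: greedy_objective_reduction implements only the simplest special case of the conflict-based idea, dropping objectives whose removal leaves the weak-dominance relation among the given solutions unchanged (an exact reduction), and pca_objective_scores ranks objectives by their loadings on the principal components of the objective correlation matrix; the function names and the variance-weighted scoring heuristic are assumptions for illustration.

```python
import numpy as np

def weakly_dominates(a, b):
    """True iff objective vector a weakly dominates b (minimization)."""
    return bool(np.all(a <= b))

def dominance_matrix(F, subset):
    """Pairwise weak-dominance relation induced by an objective subset."""
    G = F[:, subset]
    n = G.shape[0]
    return np.array([[weakly_dominates(G[i], G[j]) for j in range(n)]
                     for i in range(n)])

def greedy_objective_reduction(F):
    """Greedily drop objectives whose removal does not alter the
    dominance structure on the given solutions (illustrative sketch,
    not the chapter's exact algorithm)."""
    kept = list(range(F.shape[1]))
    full = dominance_matrix(F, kept)
    for i in range(F.shape[1]):
        trial = [j for j in kept if j != i]
        if trial and np.array_equal(dominance_matrix(F, trial), full):
            kept = trial
    return kept

def pca_objective_scores(F):
    """Score objectives by their absolute loadings on the principal
    components of the objective correlation matrix, weighted by the
    variance each component explains (an assumed heuristic)."""
    C = np.corrcoef(F, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(C)      # ascending eigenvalues
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    weights = eigvals / eigvals.sum()
    return np.abs(eigvecs) @ weights

if __name__ == "__main__":
    rng = np.random.default_rng(seed=1)
    base = rng.random((50, 2))                # two independent objectives
    F = np.column_stack([base, base[:, 0]])   # third objective copies the first
    print(greedy_objective_reduction(F))      # two objectives suffice, e.g. [1, 2]
    print(pca_objective_scores(F))            # duplicated objectives score alike
```

In the toy example the third objective is an exact copy of the first, so the greedy pass discards one of the duplicated pair and their PCA scores coincide. Real problems exhibit such redundancy only approximately, which is where the chapter's more careful conflict measures and eigenvalue analyses come in.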

Keywords

Greedy algorithm, multiobjective optimization problem, nondominated solution, dimensionality reduction method, dominance structure

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Dimo Brockhoff (1)
  • Dhish Kumar Saxena (2)
  • Kalyanmoy Deb (2)
  • Eckart Zitzler (1)
  1. Computer Engineering and Networks Laboratory (TIK), ETH Zurich, Switzerland
  2. Kanpur Genetic Algorithms Laboratory (KanGAL), Indian Institute of Technology Kanpur, India
