Combinatorial Optimization with Noisy Inputs: How Can We Separate the Wheat from the Chaff?

  • Peter Widmayer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7157)

Abstract

We postulate that real-world data are almost always noisy, and that an exact solution to a noisy input instance of a combinatorial optimization problem is not what we really want. Noise, or input data uncertainty, arises for a variety of reasons, such as the need to estimate data from imprecise measurements or from predictions (drawn from historical data and expected modifications). There are several popular ways to deal with this uncertainty. In fortunate cases where the input data distribution is known, one might aim for a solution that is good in expectation. A different, promising way to handle uncertainty relies on the availability of a discrete set of possible problem instances (sometimes reflecting a distribution), so-called scenarios. A solution must be proposed for a set of scenarios given as input, and only afterwards does a single scenario reveal itself as the actual one. The goal is to achieve high quality of the proposed solution with respect to the revealed scenario. Stochastic programming can be used to aim for a solution that is good in expectation and feasible for most scenarios. In contrast, robust optimization most often aims for a solution that is feasible in all scenarios and has the smallest worst-case cost. In any case, uncertainty is considered a curse, a burden, a difficult problem that must be dealt with at extra computational cost.
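As a minimal sketch (in our own notation, not the paper's): write S for the scenario set, X_s for the set of solutions feasible under scenario s, c(x, s) for the cost of solution x under scenario s, and p for a distribution over scenarios when one is available. The two scenario-based approaches contrasted above can then be stated as

\min_{x \,\in\, \bigcap_{s \in S} X_s} \; \mathbb{E}_{s \sim p}\left[\, c(x, s) \,\right] \qquad \text{(stochastic programming: good in expectation)}

\min_{x \,\in\, \bigcap_{s \in S} X_s} \; \max_{s \in S} \, c(x, s) \qquad \text{(robust optimization: smallest worst-case cost)}

where, in the stochastic variant, the feasibility requirement is often relaxed so that x need only be feasible for most scenarios, for instance via chance constraints or penalty terms.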

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Peter Widmayer
  1. Institute of Theoretical Computer Science, ETH Zürich, Switzerland
