Reference Work Entry

Encyclopedia of Algorithms

pp 2014-2017

Smoothed Analysis

  • Heiko Röglin, Department of Computer Science, University of Bonn

Keywords

Computational complexity; Linear programming; Probabilistic analysis

Years and Authors of Summarized Original Work

  • 2001; Spielman, Teng

  • 2004; Beier, Vöcking

Problem Definition

Smoothed analysis was originally introduced by Spielman and Teng [22] in 2001 to explain why the simplex method is usually fast in practice despite its exponential worst-case running time. Since then it has been applied to a wide range of algorithms and optimization problems. In smoothed analysis, inputs are generated in two steps: first, an adversary chooses an arbitrary instance, and then this instance is slightly perturbed at random. The smoothed performance of an algorithm is defined to be the worst expected performance the adversary can achieve. This model can be viewed as a less pessimistic worst-case analysis, in which the randomness rules out pathological worst-case instances that are rarely observed in practice but dominate the worst-case analysis. If the smoothed running time of an algorithm is low (i.e., the algorithm is efficient in expectation on any perturbed instance ...
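The two-step input model above can be illustrated with a small Monte Carlo sketch. The code below is an illustrative toy, not the analysis from the summarized work: it takes an adversarial instance for insertion sort (with comparison count as a proxy for running time), perturbs each number with Gaussian noise of standard deviation σ, and estimates the expected cost over perturbations. The function names (`insertion_sort_comparisons`, `smoothed_cost`) and the choice of insertion sort are assumptions made for the example.

```python
import random

def insertion_sort_comparisons(a):
    """Count comparisons made by insertion sort (a proxy for running time)."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return comps

def smoothed_cost(instance, sigma, trials=200, seed=0):
    """Monte Carlo estimate of E[cost(perturbed instance)] for one
    adversarial instance; the adversary's smoothed performance is the
    maximum of this quantity over all instances."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perturbed = [x + rng.gauss(0.0, sigma) for x in instance]
        total += insertion_sort_comparisons(perturbed)
    return total / trials

n = 50
worst = list(range(n, 0, -1))  # reverse-sorted: n(n-1)/2 = 1225 comparisons
print(insertion_sort_comparisons(worst))  # → 1225
print(smoothed_cost(worst, sigma=0.1))    # tiny noise: still near worst case
print(smoothed_cost(worst, sigma=50.0))   # large noise: close to a random input
```

The contrast between the two σ values mirrors the role of the perturbation magnitude in smoothed analysis: as σ grows, the model interpolates from worst-case analysis (σ → 0) toward average-case analysis over random inputs.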
