# Target-Following for Linear Programming

Benjamin Jansen

Chapter, part of the Applied Optimization book series (APOP, volume 6)

## Abstract

In Chapter 5 we analyzed a family of primal-dual affine scaling algorithms, using a direction that combines centering with moving towards optimality. As discussed in Section 5.2, the main difference between affine scaling algorithms and path-following algorithms is that in the former the search direction depends only on the given iterate, while the latter use reference points (target points) in the v-space (see Section 2.1). Not using targets may cause the step size to become extremely small if for some reason an iterate comes close to the boundary of the feasible region. This behavior can specifically be observed in the primal-dual affine scaling algorithm of Monteiro et al. [182], which does not contain a centering effect. In efficient primal-dual methods for linear programming (LP), developed by Monteiro and Adler [180], Kojima et al. [139], Lustig et al. [156, 157] and Mehrotra [170], among others, the central path is therefore used to keep the iterates sufficiently far from the boundary. So-called weighted paths have been used with the same objective in, e.g., Den Hertog et al. [104], Ding and Li [45] and Mizuno [174].

In this chapter we propose the use of a different path, which, unlike the weighted paths, improves the centering along the path. More specifically, such a path may start in any non-central point but is tangential to the central path in the limit. The path can be viewed as a continuous extension of the primal-dual Dikin-affine scaling direction. A path-following algorithm is developed that uses such a path as a guideline to optimality; this will be called the Dikin-path-following algorithm.

We stress that centering is very important in interior point methods. A sequence of iterates that approximates the central path (in the limit) will generate points converging to the analytic center of the optimal face (Güler and Ye [97]). It is well known that this center is a strictly complementary solution, thereby defining the optimal partition of the problem, which is very useful in sensitivity analysis (see Chapter 2). Also, the asymptotic analysis of certain interior point methods uses the centering to prove superlinear or quadratic convergence of algorithms, e.g., [88, 89].

## Keywords

Outer iteration · Interior point method · Central path · Quadratic convergence · Proximity measure