Abstract
We propose an elementary algorithm for solving a system of linear inequalities A^T y > 0 or its alternative system Ax = 0, x ≥ 0, x ≠ 0. Our algorithm is a smooth version of the perceptron and von Neumann algorithms: it retains the simplicity of these algorithms but has a significantly improved convergence rate. Our approach also extends to more general conic systems, provided a suitable smoothing oracle is available.
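As a point of reference for the system A^T y > 0, the following is a minimal sketch of the classic (non-smooth) perceptron update that the chapter's algorithm smooths. The function name and the small example matrix are illustrative assumptions, not taken from the chapter; the update rule (add the most violated column to y) is the standard perceptron iteration.

```python
import numpy as np

def perceptron(A, max_iter=1000):
    """Classic perceptron sketch: seek y with A^T y > 0, where the
    columns of A are the data vectors, assuming strict feasibility."""
    m, n = A.shape
    y = np.zeros(m)
    for _ in range(max_iter):
        margins = A.T @ y            # a_i^T y for every column a_i
        i = np.argmin(margins)
        if margins[i] > 0:           # all inequalities strictly satisfied
            return y
        y = y + A[:, i]              # add the most violated column
    return None                      # no strictly feasible y found in time

# Illustrative instance: columns (1,0), (0,1), (1,1); y = (1,1) is feasible.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
y = perceptron(A)
print(np.all(A.T @ y > 0))  # True
```

The classical analysis bounds the iteration count by 1/ρ², where ρ is the margin of the feasible cone; the chapter's smoothed variant improves this dependence.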
Copyright information
© 2013 Springer International Publishing Switzerland
Cite this chapter
Soheili, N., Peña, J. (2013). A Primal–Dual Smooth Perceptron–von Neumann Algorithm. In: Bezdek, K., Deza, A., Ye, Y. (eds.) Discrete Geometry and Optimization. Fields Institute Communications, vol. 69. Springer, Heidelberg. https://doi.org/10.1007/978-3-319-00200-2_17
Print ISBN: 978-3-319-00199-9
Online ISBN: 978-3-319-00200-2
eBook Packages: Mathematics and Statistics (R0)