Foundations of Computational Mathematics

Volume 18, Issue 5, pp 1131–1198

A Geometric Analysis of Phase Retrieval

  • Ju Sun
  • Qing Qu
  • John Wright


Abstract

Can we recover a complex signal from its Fourier magnitudes? More generally, given a set of m measurements \(y_k = \left| \varvec{a}_k^* \varvec{x} \right| \) for \(k = 1, \ldots , m\), is it possible to recover \(\varvec{x} \in \mathbb C^n\) (i.e., a length-n complex vector)? This generalized phase retrieval (GPR) problem is a fundamental task in various disciplines and has been the subject of much recent investigation. Natural nonconvex heuristics often work remarkably well for GPR in practice, but lack clear theoretical explanations. In this paper, we take a step toward bridging this gap. We prove that when the measurement vectors \(\varvec{a}_k\) are generic (i.i.d. complex Gaussian) and numerous enough (\(m \ge C n \log ^3 n\)), with high probability a natural least-squares formulation for GPR has the following benign geometric structure: (1) there are no spurious local minimizers, and all global minimizers are equal to the target signal \(\varvec{x}\) up to a global phase; and (2) the objective function has negative directional curvature around each saddle point. This structure allows a number of iterative optimization methods to efficiently find a global minimizer without special initialization. To corroborate the claim, we describe and analyze a second-order trust-region algorithm.
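As an illustration of the claimed geometry (this is a hedged sketch, not the paper's trust-region algorithm): with no spurious local minimizers and negative curvature at saddles, even plain first-order descent from a random starting point should typically reach the target, up to a global phase. The snippet below minimizes the least-squares objective \(f(\varvec{z}) = \frac{1}{2m}\sum_k \big(y_k^2 - |\varvec{a}_k^* \varvec{z}|^2\big)^2\) by Wirtinger gradient descent; the problem sizes, step size, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

# Toy instance of generalized phase retrieval with i.i.d. complex Gaussian
# measurement vectors, solved by plain gradient descent from a RANDOM start
# (no spectral or other special initialization).
rng = np.random.default_rng(0)
n, m = 10, 600                                     # theory asks m >= C n log^3 n
x = rng.normal(size=n) + 1j * rng.normal(size=n)   # target signal
A = (rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))) / np.sqrt(2)
y = np.abs(A @ x)                                  # phaseless measurements

def grad(z):
    """Wirtinger gradient of f(z) = (1/2m) * sum_k (y_k^2 - |a_k^* z|^2)^2."""
    r = np.abs(A @ z) ** 2 - y ** 2                # per-measurement residuals
    return A.conj().T @ (r * (A @ z)) / m

scale = np.mean(y ** 2)                            # data-driven estimate of ||x||^2
z = rng.normal(size=n) + 1j * rng.normal(size=n)   # random initialization
for _ in range(5000):
    z = z - 0.1 * grad(z) / scale                  # hand-tuned illustrative step

# The signal is identifiable only up to a global phase, so measure
# min over phi of ||x - e^{i phi} z|| / ||x||.
phase = np.vdot(z, x) / abs(np.vdot(z, x))
err = np.linalg.norm(x - phase * z) / np.linalg.norm(x)
print(f"relative error (up to global phase): {err:.2e}")
```

On generic instances like this one, the iterates escape the saddle region and the relative error drops to a small value; the paper's second-order trust-region method additionally gives worst-case guarantees by exploiting the negative curvature explicitly.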


Keywords

Phase retrieval · Nonconvex optimization · Function landscape · Second-order geometry · Ridable saddles · Trust-region method · Inverse problems · Mathematical imaging

Mathematics Subject Classification

94A12 · 65K05 · 90C26 · 49K45 · 11D09



Acknowledgements

This work was partially supported by funding from the Gordon and Betty Moore Foundation, the Alfred P. Sloan Foundation, and Grants ONR N00014-13-1-0492, NSF CCF 1527809, and NSF IIS 1546411. We thank Nicolas Boumal for helpful discussions related to the Manopt package, and Mahdi Soltanolkotabi for pointing us to his early result on local convexity around the target set for GPR in \(\mathbb R^n\). We also thank Yonina Eldar, Kishore Jaganathan, and Xiaodong Li for helpful feedback on a prior version of this paper, and the anonymous reviewers for their careful reading and constructive comments, which have helped us substantially improve the presentation.



Copyright information

© SFoCM 2017

Authors and Affiliations

  1. Department of Mathematics, Stanford University, Stanford, USA
  2. Department of Electrical Engineering, Columbia University, New York City, USA
