
Efficient Second-Order Shape-Constrained Function Fitting

  • Conference paper

Algorithms and Data Structures (WADS 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11646)

Abstract

We give an algorithm to compute a one-dimensional shape-constrained function that best fits given data in weighted-\(L_{\infty }\) norm. A single algorithm handles a variety of commonly studied shape constraints, including monotonicity, Lipschitz continuity, and convexity, and more generally any shape constraint expressible by bounds on first- and/or second-order differences. Our algorithm computes an approximation with additive error \(\epsilon \) in \(O\left( n \log \frac{U}{\epsilon } \right) \) time, where U captures the range of input values. We also give a simple greedy algorithm that runs in O(n) time for the special case of unweighted \(L_{\infty }\) convex regression. These are the first (near-)linear-time algorithms for second-order-constrained function fitting. To achieve these results, we use a novel geometric interpretation of the underlying dynamic programming problem. We further show that a generalization of the corresponding problems to directed acyclic graphs (DAGs) is as difficult as linear programming.
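The \(O\left( n \log \frac{U}{\epsilon } \right) \) bound is the shape of a binary search over candidate error thresholds combined with a linear-time feasibility check. The paper's algorithm handles general second-order constraints via its geometric dynamic-programming interpretation; as an illustration of the threshold-search pattern only (not the paper's method), here is the classical first-order special case, unweighted \(L_{\infty }\) isotonic (monotone) regression, where feasibility for a fixed threshold t reduces to a single greedy sweep. Function names are illustrative, not from the paper.

```python
def feasible(y, t):
    """Does some nondecreasing f exist with |f_i - y_i| <= t for all i?

    Greedy sweep: keep f_i as small as allowed. Each f_i must be at
    least y_i - t and at least the previous value; infeasible if this
    lower bound ever exceeds the upper bound y_i + t.
    """
    lo = float("-inf")
    for yi in y:
        lo = max(lo, yi - t)
        if lo > yi + t:
            return False
    return True


def isotonic_linf(y, eps=1e-9):
    """Binary-search the smallest feasible threshold in [0, U],
    where U = max(y) - min(y) is the range of input values."""
    lo, hi = 0.0, max(y) - min(y)
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if feasible(y, mid):
            hi = mid
        else:
            lo = mid
    return hi
```

For example, y = [3, 1, 2] admits the monotone fit f = [2, 2, 2] with maximum error 1, and no monotone fit does better, so the search converges to a threshold of 1.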

An extended online version with full proofs is available at arxiv.org/abs/1905.02149.

D. Durfee—Supported in part by National Science Foundation Grant 1718533.

S. Wild—Supported by the Natural Sciences and Engineering Research Council of Canada and the Canada Research Chairs Programme.


Notes

  1. Some problems are stated with \(\pm \infty \) values, but we can always replace unbounded values in the algorithms with an (input-specific) sufficiently large finite number.



Acknowledgments

We thank Richard Peng, Sushant Sachdeva, and Danny Sleator for insightful discussions, and our anonymous referees for further relevant references and insightful comments that significantly improved the presentation.

Author information

Correspondence to Sebastian Wild.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Durfee, D., Gao, Y., Rao, A.B., Wild, S. (2019). Efficient Second-Order Shape-Constrained Function Fitting. In: Friggstad, Z., Sack, J.-R., Salavatipour, M. (eds.) Algorithms and Data Structures. WADS 2019. Lecture Notes in Computer Science, vol. 11646. Springer, Cham. https://doi.org/10.1007/978-3-030-24766-9_29


  • DOI: https://doi.org/10.1007/978-3-030-24766-9_29


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-24765-2

  • Online ISBN: 978-3-030-24766-9

