
Sparse Relevance Kernel Machine-Based Performance Dependency Analysis of Analog and Mixed-Signal Circuits

Chapter in Machine Learning in VLSI Computer-Aided Design

Abstract

Design optimization, verification, and failure diagnosis of analog and mixed-signal (AMS) circuits require accurate models that reliably capture the complex dependencies of circuit performances on essential circuit and test parameters, such as design parameters, process variations, and test signatures. We present a novel Bayesian learning technique, the sparse relevance kernel machine (SRKM), for characterizing analog circuits with sparse statistical regression models. SRKM produces reliable classification models learned from simulation data with a limited number of samples but a large number of parameters, and, as part of the overall learning framework, computes a probabilistically inferred weighting factor quantifying the criticality of each parameter, hence offering a powerful enabler for variability modeling, failure diagnosis, and test development. Compared to other popular learning-based techniques, the proposed SRKM produces more accurate models, requires less training data, and extracts more reliable parametric rankings. The effectiveness of SRKM is demonstrated in examples including statistical variability modeling of a low-dropout regulator (LDO), built-in self-test (BIST) development for a charge-pump phase-locked loop (PLL), and the construction of statistical variability models for a commercial automotive interface design.
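The central idea the abstract describes, a kernel whose per-parameter relevance weights expose which circuit and test parameters matter, can be sketched in a few lines. The following Python snippet is a minimal illustration only, not the authors' SRKM implementation: the weighted RBF kernel form, the name `eta`, and the toy data are assumptions made for exposition, and in SRKM the relevance weights are inferred through sparse Bayesian learning rather than fixed by hand as they are here.

```python
# Minimal sketch of a feature-weighted kernel (NOT the authors' SRKM code).
# Idea from the abstract: each circuit/test parameter j carries a relevance
# weight eta_j >= 0; eta_j driven to zero prunes parameter j, while a large
# eta_j flags a critical parameter. In SRKM the eta_j are inferred
# probabilistically; here they are set by hand purely for illustration.
import numpy as np

def weighted_rbf_kernel(X1, X2, eta):
    """k(x, x') = exp(-sum_j eta_j * (x_j - x'_j)**2)."""
    diff = X1[:, None, :] - X2[None, :, :]             # shape (n1, n2, p)
    return np.exp(-np.einsum('ijk,k->ij', diff**2, eta))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                          # 3 hypothetical parameters
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]                    # parameter 3 is irrelevant

eta = np.array([1.0, 1.0, 0.0])                        # a sparse relevance pattern
K = weighted_rbf_kernel(X, X, eta)                     # Gram matrix for a kernel model

# Kernel ridge regression on the weighted kernel, as a simple stand-in for
# the sparse Bayesian fit the chapter develops.
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)
y_hat = K @ alpha
print(f"training RMSE: {np.sqrt(np.mean((y - y_hat)**2)):.4f}")
```

What distinguishes SRKM from this plain kernel ridge fit is that the relevance weights and the per-sample kernel expansion are both learned, and both sparsified, within one Bayesian inference procedure, so the surviving nonzero weights directly yield the parametric ranking the abstract refers to.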



Acknowledgements

This material is based upon work supported by the Semiconductor Research Corporation (SRC) through the Texas Analog Center of Excellence at the University of Texas at Dallas (Task ID: 2712.004).

Author information

Corresponding author: Peng Li.


Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Lin, H., Khan, A., Li, P. (2019). Sparse Relevance Kernel Machine-Based Performance Dependency Analysis of Analog and Mixed-Signal Circuits. In: Elfadel, I., Boning, D., Li, X. (eds) Machine Learning in VLSI Computer-Aided Design. Springer, Cham. https://doi.org/10.1007/978-3-030-04666-8_15

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-04666-8_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04665-1

  • Online ISBN: 978-3-030-04666-8

  • eBook Packages: Engineering, Engineering (R0)
