Abstract
Design optimization, verification, and failure diagnosis of analog and mixed-signal (AMS) circuits require accurate models that can reliably capture the complex dependencies of circuit performances on essential circuit and test parameters, such as design parameters, process variations, and test signatures. We present a novel Bayesian learning technique, namely the sparse relevance kernel machine (SRKM), for characterizing analog circuits with sparse statistical regression models. SRKM produces reliable classification models learned from simulation data with a limited number of samples but a large number of parameters, and also computes a probabilistically inferred weighting factor quantifying the criticality of each parameter as part of the overall learning framework, hence offering a powerful enabler for variability modeling, failure diagnosis, and test development. Compared to other popular learning-based techniques, the proposed SRKM produces more accurate models, requires less training data, and extracts a more reliable parametric ranking. The effectiveness of SRKM is demonstrated in examples including statistical variability modeling of a low-dropout regulator (LDO), built-in self-test (BIST) development of a charge-pump phase-locked loop (PLL), and building statistical variability models for a commercial automotive interface design.
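The core idea behind SRKM, per-parameter relevance inferred within a sparse Bayesian framework, can be loosely illustrated on a much simpler model. The sketch below is not the chapter's SRKM; it is a minimal automatic-relevance-determination (ARD) linear regression in the style of the relevance vector machine (Tipping, 2001), where each parameter receives its own precision hyperparameter and irrelevant parameters are driven toward zero weight, yielding a criticality ranking. All variable names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def ard_regression(Phi, y, n_iter=100, alpha0=1.0, beta0=1.0, cap=1e6):
    """Sparse Bayesian (ARD) linear regression: each weight w_i gets its
    own prior precision alpha_i; for irrelevant features the evidence
    updates push alpha_i toward infinity, pruning the weight."""
    N, D = Phi.shape
    alpha = np.full(D, alpha0)   # per-feature precisions
    beta = beta0                 # noise precision
    for _ in range(n_iter):
        # Posterior over weights given current hyperparameters.
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
        mu = beta * Sigma @ Phi.T @ y
        # Evidence-based re-estimation (MacKay/Tipping updates).
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = np.minimum(gamma / (mu ** 2 + 1e-12), cap)
        resid = y - Phi @ mu
        beta = (N - gamma.sum()) / (resid @ resid + 1e-12)
    return mu, alpha

# Synthetic example: 10 parameters, only the first two matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 2.0 * X[:, 0] - 3.0 * X[:, 1] + 0.05 * rng.normal(size=200)

mu, alpha = ard_regression(X, y)
relevance = 1.0 / alpha               # large value = critical parameter
ranking = np.argsort(relevance)[::-1]
print(ranking[:2])                    # the two truly relevant parameters
```

SRKM generalizes this kind of relevance inference from linear weights to weighting factors inside a kernel, so that parameter criticality is learned jointly with a nonlinear classification model.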
Acknowledgements
This material is based upon work supported by the Semiconductor Research Corporation (SRC) through Texas Analog Center of Excellence at the University of Texas at Dallas (Task ID:2712.004).
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this chapter
Lin, H., Khan, A., Li, P. (2019). Sparse Relevance Kernel Machine-Based Performance Dependency Analysis of Analog and Mixed-Signal Circuits. In: Elfadel, I., Boning, D., Li, X. (eds) Machine Learning in VLSI Computer-Aided Design. Springer, Cham. https://doi.org/10.1007/978-3-030-04666-8_15
DOI: https://doi.org/10.1007/978-3-030-04666-8_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04665-1
Online ISBN: 978-3-030-04666-8
eBook Packages: Engineering (R0)