PAC-Learnability of Probabilistic Deterministic Finite State Automata in Terms of Variation Distance
We consider the problem of PAC-learning distributions over strings represented by probabilistic deterministic finite automata (PDFAs). PDFAs are a probabilistic model for the generation of strings of symbols that has been used in speech and handwriting recognition and in bioinformatics. Recent work on learning PDFAs from random examples has used the KL-divergence as the error measure; here we use the variation distance. We build on recent work by Clark and Thollard and show that using the variation distance allows the algorithms to be simplified and the results strengthened; in particular, with the variation distance we obtain polynomial sample-size bounds that are independent of the expected length of strings.
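To make the two central objects concrete, the sketch below gives a toy PDFA as a string generator and the variation (L1) distance between two distributions over strings. The specific states, symbols, and transition probabilities are illustrative assumptions, not the constructions from the paper.

```python
import random

# A toy PDFA (illustrative, not from the paper). Each state maps a
# symbol to a (next_state, probability) pair; the special end symbol
# '$' terminates generation. Probabilities out of each state sum to 1.
PDFA = {
    "q0": {"a": ("q1", 0.5), "b": ("q0", 0.3), "$": (None, 0.2)},
    "q1": {"a": ("q1", 0.4), "b": ("q0", 0.1), "$": (None, 0.5)},
}

def sample_string(pdfa, start="q0", rng=random):
    """Generate one string by walking the PDFA until the end symbol."""
    state, out = start, []
    while True:
        symbols = list(pdfa[state])
        weights = [p for _, p in pdfa[state].values()]
        sym = rng.choices(symbols, weights=weights)[0]
        if sym == "$":
            return "".join(out)
        out.append(sym)
        state = pdfa[state][sym][0]

def variation_distance(p, q):
    """Variation (L1) distance between two distributions over strings,
    given as dicts mapping string -> probability."""
    support = set(p) | set(q)
    return sum(abs(p.get(s, 0.0) - q.get(s, 0.0)) for s in support)
```

For example, two distributions with disjoint support are at the maximum variation distance of 2, whereas identical distributions are at distance 0; a learning algorithm succeeds under this measure if its hypothesis distribution is within a target distance of the one generating the sample.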