Enhancing Automata Learning by Log-Based Metrics
- Cite this paper as:
- van den Bos P., Smetsers R., Vaandrager F. (2016) Enhancing Automata Learning by Log-Based Metrics. In: Ábrahám E., Huisman M. (eds) Integrated Formal Methods. IFM 2016. Lecture Notes in Computer Science, vol 9681. Springer, Cham
We study a general class of distance metrics for deterministic Mealy machines. The metrics are induced by weight functions that specify the relative importance of input sequences. By choosing an appropriate weight function we may fine-tune a metric so that it captures some intuitive notion of quality. In particular, we present a metric that is based on the minimal number of inputs that must be provided to obtain a counterexample, starting from states that can be reached by a given set of logs. For any weight function, we may boost the performance of existing model learning algorithms by introducing an extra component, which we call the Comparator. Preliminary experiments show that using the Comparator yields a significant reduction in the number of inputs required to learn correct models, compared to current state-of-the-art algorithms. In existing automata learning algorithms, the quality of successive hypotheses may decrease. Generalising a result of Smetsers et al., we show that the quality of the hypotheses generated by the Comparator never decreases.
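To illustrate the idea of a weight-induced distance on Mealy machines, the sketch below compares two machines over all input sequences up to a bound and reports the weight of the shortest disagreement. This is an illustrative sketch, not the paper's implementation: the weight function w(sigma) = 2^(-|sigma|) is just one example from the general class of weight functions the paper considers, and the class and function names are invented for this example.

```python
from itertools import product

class Mealy:
    """Minimal deterministic Mealy machine (illustrative, not the paper's code)."""
    def __init__(self, initial, trans, out):
        self.initial = initial   # initial state
        self.trans = trans       # (state, input) -> next state
        self.out = out           # (state, input) -> output

    def run(self, inputs):
        """Return the output sequence produced by an input sequence."""
        state, outputs = self.initial, []
        for i in inputs:
            outputs.append(self.out[(state, i)])
            state = self.trans[(state, i)]
        return outputs

def distance(m1, m2, alphabet, max_len=8):
    """Approximate a weight-induced distance: the largest weight
    w(sigma) = 2**(-len(sigma)) of an input sequence sigma on which
    the two machines produce different outputs (bounded-depth search).
    Since shorter sequences carry larger weight, the first (shortest)
    disagreement found realises the supremum."""
    for n in range(1, max_len + 1):
        for sigma in product(alphabet, repeat=n):
            if m1.run(sigma) != m2.run(sigma):
                return 2.0 ** (-n)
    return 0.0  # no disagreement up to max_len: machines look equivalent
```

For example, two machines that agree on the first input but diverge on the second have distance 2^(-2) = 0.25 under this weight function; machines that are equivalent up to the search bound get distance 0.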