Grammars are imperfect models of linguistic behavior. To the extent that we are more interested in competence than in performance (see Section 3.1), this is actually desirable, but more typically discrepancies between the predictions of the model and the observables represent serious over- or undergeneration (see Section 2.2). There is, moreover, an important range of models and phenomena where it is not quite obvious which of the cases above obtain. Suppose the task is to predict the rest of the series 2, 3, 5, …. A number of attractive hypotheses present themselves: the prime numbers, the Fibonacci numbers, square-free numbers, the sequence 2, 3, 5, 2, 3, 5, 2, 3, 5, …, and so on. The empirically minded reader may object that the situation will be greatly simplified if we obtain a few more data points, but this is quite often impossible: the set of actual human languages cannot be extended at will.
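The underdetermination can be made concrete: each hypothesis mentioned above generates a sequence beginning 2, 3, 5, yet they diverge at the very next term, so no finite amount of cleverness about the prefix alone can decide between them. A minimal Python sketch (the generator names and the decision to start the Fibonacci and square-free sequences at 2 are ours, for illustration only):

```python
from itertools import count, islice

def primes():
    # Trial division: yield n if no d in [2, sqrt(n)] divides it.
    for n in count(2):
        if all(n % d for d in range(2, int(n**0.5) + 1)):
            yield n

def fibonacci():
    # The Fibonacci tail starting 2, 3, 5, 8, ...
    a, b = 2, 3
    while True:
        yield a
        a, b = b, a + b

def squarefree():
    # Square-free numbers from 2 on (the full sequence begins with 1):
    # yield n if no square d^2 > 1 divides it.
    for n in count(2):
        if all(n % (d * d) for d in range(2, int(n**0.5) + 1)):
            yield n

def periodic():
    # The eternally repeating 2, 3, 5, 2, 3, 5, ...
    while True:
        yield from (2, 3, 5)

hypotheses = {
    "primes": primes,
    "Fibonacci": fibonacci,
    "square-free": squarefree,
    "periodic": periodic,
}

for name, gen in hypotheses.items():
    print(name, list(islice(gen(), 4)))
```

All four agree on the observed data 2, 3, 5 but predict four different continuations (7, 8, 6, and 2, respectively) — exactly the predicament the text describes when the data set cannot be extended at will.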
Kornai, A. (2008). Complexity. In: Mathematical Linguistics. Advanced Information and Knowledge Processing. Springer, London. https://doi.org/10.1007/978-1-84628-986-6_7

© 2008 Springer-Verlag Berlin Heidelberg. Print ISBN 978-1-84628-985-9; Online ISBN 978-1-84628-986-6.