Learning Competence from Performance Data: Learnability and Symbolic Simulated Annealing for OT and HG.
KNAW Academy Colloquium on Language Acquisition and Optimality Theory, July 2-3, 2009, Amsterdam. Poster.




Given a theory of what grammars consist of, a learning algorithm aims at finding the specific grammar that may have produced the learning data. Grammars are models of the linguistic competence of the speaker and of the hearer. In most approaches, the learning data are taken to be produced directly by the linguistic competence (hence, in the corresponding models, by grammars), and are therefore always grammatical. Alternatively, some random "noise" is added to the data, loosely standing in for speech errors by the speaker, acoustic distortion and parsing errors by the hearer. Yet performance effects can be more complex than mere random noise.

In Optimality Theory, performance is seen as the algorithm implementing the grammar. Smolensky and Legendre (2006) developed a connectionist approach to performance, whereas the approach advocated independently by Bíró (2006) is purely symbolic. Bíró shows that this approach correctly predicts speech-error patterns. His Simulated Annealing for Optimality Theory Algorithm produces not only "fast speech forms" but also "irregularities". The latter are local optima that are globally suboptimal, yet the algorithm returns them with a constant frequency, independently of the cooling schedule. The influence of "fast speech forms" and of "irregularities" on learning was first investigated in Bíró (2007). This poster elaborates on those results by demonstrating how the system's behaviour changes if Optimality Theory is replaced by a symbolic Harmonic Grammar.
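To make the notion of globally suboptimal local optima concrete, the following is a minimal generic simulated-annealing sketch on a hypothetical one-dimensional energy landscape. It is not Bíró's SA-OT algorithm itself (which walks a candidate graph and evaluates candidates against a ranked OT constraint hierarchy rather than a scalar energy); the landscape, the geometric cooling schedule and all parameter values are illustrative assumptions. It shows how a local optimum (here position 8) is returned on a stable fraction of runs even though a strictly better global optimum (position 0) exists.

```python
import math
import random

# Toy landscape (assumption, for illustration only): global optimum at
# position 0 (energy 0.0), local optimum at position 8 (energy 0.5),
# separated by a barrier peaking at position 4.
ENERGY = [0.0, 1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 0.5, 1.5, 2.5]

def neighbors(x):
    """Neighbouring candidates: one step left or right on the chain."""
    return [n for n in (x - 1, x + 1) if 0 <= n < len(ENERGY)]

def anneal(rng, t_max=5.0, t_min=1e-3, alpha=0.95):
    """Simulated annealing with geometric cooling: always accept an
    improvement; accept a worsening move with probability exp(-delta/T)."""
    current = rng.randrange(len(ENERGY))
    t = t_max
    while t > t_min:
        cand = rng.choice(neighbors(current))
        delta = ENERGY[cand] - ENERGY[current]
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
        t *= alpha
    return current

if __name__ == "__main__":
    outcomes = [anneal(random.Random(seed)) for seed in range(500)]
    # The globally suboptimal local optimum (8) surfaces on a stable
    # fraction of runs -- the analogue of the "irregular" forms.
    print({x: outcomes.count(x) for x in sorted(set(outcomes))})
```

By the end of the cooling schedule, uphill moves are effectively never accepted, so every run freezes in one of the two optima; which one depends on the basin the walk happens to occupy when the temperature drops.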



Tamás Bíró. Finding the Right Words: Implementing Optimality Theory with Simulated Annealing. PhD thesis, University of Groningen, 2006. ROA-896.

Tamás Bíró. 'The benefits of errors: Learning an OT grammar with a structured candidate set'. In Proceedings of the Workshop on Cognitive Aspects of Computational Language Acquisition, pages 81–88, Prague, Czech Republic, June 2007.

Paul Smolensky and Géraldine Legendre (eds.). The Harmonic Mind: From Neural Computation to Optimality-Theoretic Grammar. MIT Press, Cambridge, 2006.