Relating competence and performance in phonological encoding through neural network computation

old_uid: 7275
title: Relating competence and performance in phonological encoding through neural network computation
start_date: 2009/06/30
schedule: 15h45-17h
online: no
location_info: Big Conference Room (1.63)
summary: While activation-based models capture important aspects of linguistic performance, simple implementations through local connectionist networks are unable to incorporate other important aspects of performance that are related to linguistic competence. Linguistic theory argues that competence requires representations with rich discrete structure - such as hierarchically related constituents - and discrete knowledge about the well-formedness of these structures - grammar. Such discrete structure and knowledge are lacking in the representations and processing of simple local connectionist networks. This talk will present early work on phonological encoding that proposes: (1) theoretical principles which integrate activation-based approaches to linguistic performance with a recent approach to linguistic competence, Optimality Theory (Prince & Smolensky 1993/2004), and (2) a connectionist implementation of these principles. The network realization combines general formal techniques for: (i) realizing discrete symbolic structures in distributed activation patterns within the network; (ii) realizing Optimality-Theoretic grammatical constraints on these structures in the connections of the network: the grammatical well-formedness of a linguistic structure is realized as the connectionist well-formedness of the corresponding activation pattern; (iii) optimizing over these constraints to generate grammatically well-formed representations by spreading activation through the network connections; and (iv) selecting a coherent response that corresponds to a single discrete structure (not a mixture of structures). By considering extremely simple cases, we will examine the theory's ability to explain - and ground in processing mechanisms - some very general relations between grammatical well-formedness and performance. Joint work with Matthew Goldrick (Linguistics Department, Northwestern University) & Donald Mathis (Cognitive Science Department, Johns Hopkins University)
responsibles: Zondervan
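The core idea in points (ii)-(iv) of the summary - well-formedness realized as connectionist "harmony" in the weights, with spreading activation climbing toward a single coherent discrete structure - can be illustrated with a minimal Hopfield-style sketch. This is a generic toy illustration, not the talk's actual model: the stored patterns, weights, and update rule here are all assumptions chosen for simplicity.

```python
import numpy as np

# Two candidate discrete structures, encoded as +/-1 activation patterns
# (hypothetical stand-ins for symbolic structures; not from the talk).
patterns = np.array([[1, -1, 1, -1],
                     [1, 1, -1, -1]], dtype=float)

# Hebbian, symmetric weights play the role of "constraints": they make the
# stored patterns high-harmony states. Harmony H(a) = 0.5 * a @ W @ a.
W = patterns.T @ patterns
np.fill_diagonal(W, 0)

def harmony(a):
    return 0.5 * a @ W @ a

def settle(a, sweeps=10):
    """Spread activation asynchronously; each update can only raise harmony."""
    a = a.copy()
    for _ in range(sweeps):
        for i in range(len(a)):
            a[i] = 1.0 if W[i] @ a >= 0 else -1.0
    return a

start = np.array([1, -1, -1, -1], dtype=float)  # incoherent "mixture" input
final = settle(start)
print(final, harmony(start), harmony(final))
```

Settling drives the incoherent input to one of the stored patterns, with harmony increasing along the way - a toy analogue of selecting a single well-formed structure rather than a blend.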