| old_uid | 9238 |
|---|
| title | Grammatical Models for Constituency and Dependency Structures |
|---|
| start_date | 2010/11/12 |
|---|
| schedule | 11h-13h |
|---|
| online | no |
|---|
| location_info | 3rd floor, room 3E91 |
|---|
| summary | In this talk I would like to present two areas of research I am working on. Both are concerned with corpus-based analyses of syntactic structures and with the formulation of statistical models for parsing.
In the first part I will adopt Phrase Structure (PS) as the underlying syntactic representation and Data-Oriented Parsing (DOP) as the grammatical framework. A common assumption in many linguistic theories is that a syntactic construction is linguistically relevant if there is empirical evidence of its reusability in a representative corpus of examples. Building on this intuition, I will show how a kernel-based methodology makes it possible to efficiently identify all tree fragments that recur multiple times in any large treebank. I will illustrate how this can be useful both for guiding linguistic analysis and for parsing novel sentences.
In the second part I will introduce a novel syntactic dependency representation (TDS) inspired by the work of Lucien Tesnière. In this work we have attempted to go back to the roots of dependency theory and to formulate a way of transforming the English WSJ treebank into a novel DS notation, which we claim is closer to the original formulation than other DS conversions. I will show how TDS incorporates the main advantages of both PS and modern DS, while avoiding well-known problems concerning the choice of heads and better representing common linguistic phenomena such as coordination. Finally, I will present some preliminary results on parsing TDS. |
|---|
| responsibles | Crabbé |
|---|
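
As a rough illustration of the fragment-reuse idea described in the first part of the summary, here is a minimal, brute-force sketch in Python. It is not the kernel-based method presented in the talk: it simply enumerates depth-bounded fragments of bracketed parse trees and reports those occurring more than once. The `parse` helper, the toy treebank, and the depth bound are assumptions made purely for the example.

```python
from collections import Counter
from itertools import product

def parse(s):
    """Read a bracketed tree like "(S (NP (D the) (N dog)) (VP (V barks)))"
    into nested tuples: ("S", (("NP", ...), ("VP", ...)))."""
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()
    pos = 0

    def read():
        nonlocal pos
        pos += 1                                  # skip "("
        label = tokens[pos]
        pos += 1
        children = []
        while tokens[pos] != ")":
            if tokens[pos] == "(":
                children.append(read())
            else:
                children.append((tokens[pos],))   # lexical leaf
                pos += 1
        pos += 1                                  # skip ")"
        return (label, tuple(children))

    return read()

def fragments(node, depth):
    """Enumerate fragments rooted at `node`: each child is either cut
    (keeping only its category as a frontier node) or expanded further,
    bounded by `depth` to keep this toy enumeration small."""
    label, children = node[0], node[1]
    if not children or depth == 0:
        yield (label,)
        return
    options = []
    for child in children:
        if len(child) == 1:                       # word: always keep it
            options.append([child])
            continue
        opts = [(child[0],)]                      # cut: frontier non-terminal
        for frag in fragments(child, depth - 1):  # or expand the child
            if frag not in opts:
                opts.append(frag)
        options.append(opts)
    for combo in product(*options):
        yield (label, tuple(combo))

def recurring_fragments(treebank, depth=2):
    """Count fragments over all nodes of all trees; keep those seen twice or more."""
    counts = Counter()
    for tree in treebank:
        stack = [tree]
        while stack:
            node = stack.pop()
            if len(node) > 1:                     # internal node
                counts.update(fragments(node, depth))
                stack.extend(node[1])
    return {frag: n for frag, n in counts.items() if n > 1}

if __name__ == "__main__":
    toy_treebank = [
        parse("(S (NP (D the) (N dog)) (VP (V barks)))"),
        parse("(S (NP (D the) (N cat)) (VP (V sleeps)))"),
    ]
    for frag, n in sorted(recurring_fragments(toy_treebank).items(), key=str):
        print(n, frag)
```

Because the number of fragments grows exponentially with tree size, a brute-force enumeration like this does not scale; the appeal of the kernel-based approach mentioned in the summary is precisely that it identifies the recurring fragments without enumerating the full fragment space.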
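
The second part of the summary notes that TDS avoids forcing one conjunct to act as the head of a coordination, as many modern DS conversions do. The sketch below is a hypothetical, simplified data structure (not the actual TDS formalism) illustrating that design choice: coordination is a unit of its own whose conjuncts are equal siblings.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Word:
    form: str

@dataclass
class Block:
    # A plain dependency unit: its word(s) plus the units that depend on it.
    words: List[Word]
    dependents: List["Unit"] = field(default_factory=list)

@dataclass
class Coordination:
    # Coordination as a single unit whose conjuncts are equal siblings,
    # instead of one conjunct governing the others.
    conjuncts: List["Unit"]
    conjunctions: List[Word] = field(default_factory=list)
    dependents: List["Unit"] = field(default_factory=list)

# A unit is either a plain block or a coordination of blocks.
Unit = Union[Block, Coordination]

# "John and Mary sleep": the subject is a coordination of two conjuncts,
# and the coordination as a whole depends on the verb.
subject = Coordination(
    conjuncts=[Block(words=[Word("John")]), Block(words=[Word("Mary")])],
    conjunctions=[Word("and")],
)
sentence = Block(words=[Word("sleep")], dependents=[subject])
print(sentence)
```

In a conventional head-based conversion, "John and Mary sleep" would typically attach one conjunct below the other; here the coordination as a whole is the dependent of the verb, which is closer in spirit to the treatment of coordination described in the summary.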