Prof. Joakim Nivre
Bare-Bones Dependency Parsing — A Case for Occam's Razor?

Uppsala University


The notion of dependency has come to play an increasingly central role in natural language parsing in recent years. On the one hand, lexical dependencies have been incorporated in statistical models for a variety of syntactic representations. On the other hand, dependency relations extracted from such representations have been exploited in many practical applications. Given these developments, it is not surprising that there has also been a growing interest in parsing models that map sentences directly to dependency trees, an approach that may be called "bare-bones dependency parsing" to distinguish it from parsing methods where dependencies are embedded into or extracted from other types of syntactic representations. In this talk, I will survey recent advances in bare-bones dependency parsing, covering all major approaches but focusing on transition-based methods for highly efficient parsing. I will specifically address the question of how such systems can handle long-distance dependencies and other phenomena that have been argued to require richer representations, and I will discuss recent work that attempts to evaluate bare-bones dependency parsers in relation to other methods for producing dependency trees. I will conclude with some thoughts on the most important challenges for the future.
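To make the idea of mapping sentences directly to dependency trees concrete, here is a minimal sketch of arc-standard transition-based parsing, one of the transition systems used in this line of work. The sentence, the gold head assignments, and the rule that consults them are illustrative stand-ins: a real transition-based parser would replace the gold-head lookup with a learned classifier that scores SHIFT, LEFT-ARC, and RIGHT-ARC actions. The sketch assumes a projective tree.

```python
def parse(words, gold_heads):
    """Derive head[i] for each token via arc-standard transitions.

    words: list of tokens; index 0 is reserved for an artificial ROOT.
    gold_heads: gold_heads[i] is the head of token i (0 = ROOT).
    The gold-head checks below play the role of a trained classifier.
    """
    n = len(words)
    heads = [None] * (n + 1)
    stack = [0]                      # start with ROOT on the stack
    buffer = list(range(1, n + 1))   # token indices still to be read

    def has_all_children(i):
        # Reducing i is only safe once all of i's dependents are attached.
        return all(heads[j] is not None
                   for j in range(1, n + 1) if gold_heads[j] == i)

    while buffer or len(stack) > 1:
        if len(stack) >= 2:
            s1, s0 = stack[-2], stack[-1]
            if s1 != 0 and gold_heads[s1] == s0:
                heads[s1] = s0       # LEFT-ARC: s0 -> s1, pop s1
                stack.pop(-2)
                continue
            if gold_heads[s0] == s1 and has_all_children(s0):
                heads[s0] = s1       # RIGHT-ARC: s1 -> s0, pop s0
                stack.pop()
                continue
        stack.append(buffer.pop(0))  # SHIFT: move next token onto the stack

    return heads[1:]

# Illustrative example (heads chosen by hand, not from a treebank):
words = ["Economic", "news", "had", "little", "effect"]
gold = {1: 2, 2: 3, 3: 0, 4: 5, 5: 3}
print(parse(words, gold))  # [2, 3, 0, 5, 3]
```

Each token is shifted and reduced at most once, which is what makes this family of parsers run in linear time; handling non-projective (long-distance) dependencies requires extending the transition system, which is one of the questions the talk addresses.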

Prof. Bonnie Webber
Discourse Structures and Language Technologies

University of Edinburgh


In this talk I explore how, on the one hand, discourse structures can (or promise to) help to improve language technologies, and, on the other, language technologies can help to induce and model discourse structures. Since much of this depends on understanding the features of different discourse structures, I'll spend some of the talk describing them.

Prof. Guntis Bārzdiņš
When FrameNet meets a Controlled Natural Language

University of Latvia


There are two approaches to natural language processing: one goes in width, covering at a shallow level (parsing, syntax) the rich linguistic variety found in natural language, while the other goes in depth (semantics, discourse structure) for a monosemous subset of natural language referred to as a controlled natural language (CNL). Today we are nowhere near bridging the gap between the two approaches. In this presentation I will argue that, despite the elusiveness of this goal, FrameNet might provide sufficient insight into the deeper semantic layers of natural language to envision a new kind of rich CNL that narrows the gap with true natural language. A blueprint for PAO, a procedural CNL of this new kind, will be discussed as an example.