Learnability in Optimality Theory
Highlighting the close relationship between linguistic explanation and learnability, Bruce Tesar and Paul Smolensky examine the implications of Optimality Theory (OT) for language learnability. They show how the core principles of OT lead to the learning principle of constraint demotion, the basis for a family of algorithms that infer constraint rankings from linguistic forms.
Of primary concern to the authors are the ambiguity of the data received by the learner and the resulting interdependence of the core grammar and the structural analysis of overt linguistic forms. The authors argue that iterative approaches to interdependencies, inspired by work in statistical learning theory, can be successfully adapted to address the interdependencies of language learning. Both OT and constraint demotion play critical roles in their adaptation. The authors support their findings both formally and through simulations. They also illustrate how their approach could be extended to other language learning issues, including subset relations and the learning of phonological underlying forms.
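To make the idea concrete, the following is a minimal sketch of recursive constraint demotion, one member of the algorithm family described above. It assumes learning data given as winner-loser pairs, with each candidate represented as a mapping from constraint names to violation counts; the function and variable names are illustrative rather than the authors' own notation.

# A minimal sketch of recursive constraint demotion, assuming learning data
# given as winner-loser pairs, with each candidate a dict mapping constraint
# names to violation counts. Names and data format are illustrative.

def recursive_constraint_demotion(constraints, pairs):
    """Return a stratified constraint hierarchy (list of strata, highest-ranked first).

    pairs: list of (winner, loser) tuples; each candidate is a dict
    constraint -> number of violations.
    """
    remaining = set(constraints)
    data = list(pairs)
    hierarchy = []
    while remaining:
        # Place in the current stratum every constraint that prefers no loser,
        # i.e. never assigns the winner more violations than the loser.
        stratum = {c for c in remaining
                   if all(w.get(c, 0) <= l.get(c, 0) for w, l in data)}
        if not stratum:
            # No consistent ranking exists for these data.
            raise ValueError("winner-loser pairs are inconsistent")
        hierarchy.append(stratum)
        remaining -= stratum
        # Discard pairs already accounted for: their winner is strictly
        # preferred by some constraint in the new stratum.
        data = [(w, l) for w, l in data
                if not any(w.get(c, 0) < l.get(c, 0) for c in stratum)]
    return hierarchy

# Example with hypothetical constraint names: a single pair in which the
# winner satisfies Onset but violates NoCoda, while the loser does the reverse.
constraints = ["Onset", "NoCoda", "Max", "Dep"]
winner = {"Onset": 0, "NoCoda": 1, "Max": 0, "Dep": 0}
loser = {"Onset": 1, "NoCoda": 0, "Max": 0, "Dep": 0}
print(recursive_constraint_demotion(constraints, [(winner, loser)]))
# NoCoda is demoted below Onset: e.g. [{'Onset', 'Max', 'Dep'}, {'NoCoda'}]

The sketch returns strata rather than a total order because the data in general determine only a stratified hierarchy; any total ranking consistent with the strata accounts for the winner-loser pairs.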
Related Results
Learnability
Learnability is not exactly a new concept in information technology, nor in cognitive science. Learnability has been a key concept of usability (Folmer & Bosch, 2004) in the ar...
Learnability in Automated Driving (LiAD): Concepts for Applying Learnability Engineering (CALE) Based on Long-Term Learning Effects
Learnability in Automated Driving (LiAD) is a neglected research topic, especially when considering the unpredictable and intricate ways humans learn to interact and use automated ...
Suboptimality in Perceptual Decision Making
Short Abstract: Human perceptual decisions are often described as optimal, but this view remains controversial. To elucidate the issue, we review the vast literature on suboptimaliti...
Papers on pragmasemantics
Optimality theory as used in linguistics (Prince & Smolensky, 1993/2004; Smolensky & Legendre, 2006) and cognitive psychology (Gigerenzer & Selten, 2001) is a theoretic...
Output-Driven Phonology
This book presents the theory of output-driven maps and provides a fresh perspective on the extent to which phonologies can be characterized in terms of restrictions on outputs. Cl...
Case in Optimality Theory
Abstract: In optimality theory, a grammar consists of a set of constraints which are violable and typically conflicting. That these constraints are violable implies that an output of...
Graphs whose \(l_p\)-optimal rankings are \(l_{\infty}\) Optimal
A ranking on a graph G is a function f: V(G) → {1, 2, …, k} with the following restriction: if f(u) = f(v) for any u, v ∈ V(G), then on every uv path ...
Learnability Theory
Abstract: The issue of learnability has dominated the study of generative grammar for most of its history. The problem of learnability is how to account for the fact that any normal ...

