Distributional learning of context-free and multiple context-free grammars

Alexander Clark, Ryo Yoshinaka

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

11 Citations (Scopus)

Abstract

This chapter reviews recent progress in distributional learning in grammatical inference as applied to learning context-free and multiple context-free grammars. We discuss the basic principles of distributional learning and present two classes of representations, primal and dual, where primal approaches use nonterminals based on strings or sets of strings, and dual approaches use nonterminals based on contexts or sets of contexts. We then present learning algorithms based on these two models under a variety of learning paradigms, and discuss the natural extension to mildly context-sensitive formalisms, using multiple context-free grammars as a representative formalism.
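As an informal illustration of the distributional idea described in the abstract (this sketch is not code from the chapter), the "distribution" of a substring u in a sample is the set of contexts (l, r) such that l·u·r occurs in the sample. Primal approaches identify nonterminals with substrings (or sets of them); dual approaches identify them with contexts (or sets of them). The toy sample and function name below are illustrative assumptions:

```python
from collections import defaultdict

def substring_contexts(sample):
    """Map each substring u of the sample to its distribution: the set
    of contexts (l, r) such that l + u + r is a string in the sample."""
    contexts = defaultdict(set)
    for w in sample:
        n = len(w)
        for i in range(n + 1):
            for j in range(i, n + 1):
                u = w[i:j]                 # candidate substring (primal object)
                contexts[u].add((w[:i], w[j:]))  # its context (dual object)
    return contexts

# Toy sample from the language { a^n b^n : n >= 1 }
sample = ["ab", "aabb"]
dist = substring_contexts(sample)

# "ab" occurs in the empty context ("", "") and in the context ("a", "b");
# a primal learner would use "ab" itself as a nonterminal candidate,
# while a dual learner would use a context such as ("a", "b").
print(dist["ab"])
```

Enumerating all substrings and contexts is quadratic per string, which is why practical distributional learners restrict attention to small sets of substrings and contexts drawn from the data.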

Original language: English
Title of host publication: Topics in Grammatical Inference
Publisher: Springer Berlin Heidelberg
Pages: 143-172
Number of pages: 30
ISBN (Electronic): 9783662483954
ISBN (Print): 9783662483930
DOIs
Publication status: Published - 2016 Jan 1

