Distributional learning of conjunctive grammars and contextual binary feature grammars

Research output: Contribution to journal › Article › peer-review



Approaches based on the idea generically called distributional learning have achieved great success in the algorithmic learning of several rich subclasses of context-free languages and their extensions. These language classes are defined by properties of the relation between strings and their contexts. In this paper, we present a distributional learning algorithm for conjunctive grammars with the k-finite context property (k-FCP) for each natural number k. We also compare our result with the closely related work by Clark et al. (JMLR 2010) [5] on learning certain context-free grammars using contextual binary feature grammars (CBFGs). We prove that the context-free grammars targeted by their algorithm have the k-FCP. Moreover, we show that every exact CBFG also has the k-FCP, although not all exact CBFGs are learnable by their algorithm. Clark et al. conjectured that a learning algorithm for exact CBFGs should exist; this paper answers their conjecture positively.

Original language: English
Pages (from-to): 359-374
Number of pages: 16
Journal: Journal of Computer and System Sciences
Publication status: Published - September 2019


Keywords:
  • Conjunctive grammars
  • Context-free grammars
  • Contextual binary feature grammars
  • Distributional learning
  • Grammatical inference
  • Learning theory


