Variational Bayesian mixture model on a subspace of exponential family distributions

Kazuho Watanabe, Shotaro Akaho, Shinichiro Omachi, Masato Okada

Research output: Article › peer-review

16 Citations (Scopus)

Abstract

Exponential principal component analysis (e-PCA) has been proposed to reduce the dimension of the parameters of probability distributions, using Kullback information as a distance between two distributions. It also provides a framework for dealing with various data types, such as binary and integer data, for which the Gaussian assumption on the data distribution is inappropriate. In this paper, we introduce a latent variable model for the e-PCA. Assuming a discrete distribution on the latent variable leads to mixture models with constraints on their parameters. This provides a framework for clustering on a lower-dimensional subspace of exponential family distributions. We derive a learning algorithm for these mixture models based on the variational Bayes (VB) method. Although an intractable integration is required to implement the algorithm for a subspace, an approximation technique using Laplace's method allows us to carry out clustering on an arbitrary subspace. Combined with the estimation of the subspace, the resulting algorithm performs simultaneous dimensionality reduction and clustering. Numerical experiments on synthetic and real data demonstrate its effectiveness for extracting the structure of data as a visualization technique and its high generalization ability as a density estimation model.
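The abstract notes that Laplace's method is used to approximate an intractable integral in the VB algorithm. As a minimal, hypothetical sketch (not the paper's actual algorithm; the function name and grid-based mode search are illustrative), the one-dimensional version approximates an integral of the form ∫ exp(f(x)) dx by a Gaussian fitted at the mode of f:

```python
import math

def laplace_approx(log_f, grid):
    """Laplace's method in 1-D: approximate the integral of exp(log_f(x))
    by exp(log_f(x_hat)) * sqrt(2*pi / -log_f''(x_hat)), where x_hat is
    the mode of log_f. The grid search for the mode is illustrative only."""
    x_hat = max(grid, key=log_f)  # crude mode search over a candidate grid
    eps = 1e-4
    # second derivative at the mode via central finite differences
    d2 = (log_f(x_hat + eps) - 2 * log_f(x_hat) + log_f(x_hat - eps)) / eps**2
    return math.exp(log_f(x_hat)) * math.sqrt(2 * math.pi / -d2)

# Sanity check: for a Gaussian log-integrand the approximation is exact,
# and the integral of exp(-(x-1)^2 / 2) equals sqrt(2*pi).
log_f = lambda x: -0.5 * (x - 1.0) ** 2
grid = [i * 0.01 for i in range(-500, 500)]
approx = laplace_approx(log_f, grid)
```

For non-Gaussian integrands (as in the constrained exponential-family posteriors the paper treats), the approximation is no longer exact but remains accurate when the integrand is sharply peaked around its mode.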

Original language: English
Article number: 5247016
Pages (from-to): 1783-1796
Number of pages: 14
Journal: IEEE Transactions on Neural Networks
Volume: 20
Issue number: 11
DOI
Publication status: Published - 2009 Nov 1

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
