
Pachinko allocation

In machine learning and natural language processing, the pachinko allocation model (PAM) is a topic model. Topic models are a suite of algorithms to uncover the hidden thematic structure of a collection of documents.[1] The algorithm improves upon earlier topic models such as latent Dirichlet allocation (LDA) by modeling correlations between topics in addition to the word correlations which constitute topics. PAM provides more flexibility and greater expressive power than latent Dirichlet allocation.[2] While first described and implemented in the context of natural language processing, the algorithm may have applications in other fields such as bioinformatics. The model is named for pachinko machines, a game popular in Japan in which metal balls bounce down around a complex collection of pins until they land in various bins at the bottom.[3]

History

Pachinko allocation was first described by Wei Li and Andrew McCallum in 2006.[3] The idea was extended with hierarchical Pachinko allocation by Li, McCallum, and David Mimno in 2007.[4] In 2007, McCallum and his colleagues proposed a nonparametric Bayesian prior for PAM based on a variant of the hierarchical Dirichlet process (HDP).[2] The algorithm has been implemented in the MALLET software package published by McCallum's group at the University of Massachusetts Amherst.

Model

PAM connects words in V and topics in T with an arbitrary directed acyclic graph (DAG), where topic nodes occupy the interior levels and the leaves are words.

The probability of generating a whole corpus is the product of the probabilities for every document:[3]

    P(\mathbf{D} \mid \alpha) = \prod_{d} P(d \mid \alpha)
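As a concrete illustration, the generative process described above can be sketched for a simple four-level DAG: a root, a layer of super-topics, a layer of sub-topics, and word leaves. The dimensions, priors, and variable names below are illustrative assumptions for a toy vocabulary, not values from the original paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not from the paper).
V = 6  # vocabulary size
S = 2  # super-topics (children of the root)
K = 3  # sub-topics (parents of the word leaves)

alpha_root = np.ones(S)        # Dirichlet prior at the root
alpha_super = np.ones((S, K))  # one Dirichlet prior per super-topic
beta = np.ones(V)              # prior over words for each sub-topic

# Sub-topic word distributions, shared across the corpus; shape (K, V).
phi = rng.dirichlet(beta, size=K)

def generate_document(n_words):
    """Sample one document by walking the DAG root -> super-topic
    -> sub-topic -> word for each token."""
    theta_root = rng.dirichlet(alpha_root)  # root's distribution over super-topics
    theta_super = np.array([rng.dirichlet(a) for a in alpha_super])  # shape (S, K)
    words = []
    for _ in range(n_words):
        s = rng.choice(S, p=theta_root)      # step from root to a super-topic
        k = rng.choice(K, p=theta_super[s])  # step to a sub-topic
        w = rng.choice(V, p=phi[k])          # leaf: emit a word
        words.append(int(w))
    return words

doc = generate_document(10)
```

Because each document draws its own multinomials over the interior nodes, co-occurrence of sub-topics is mediated by the super-topics, which is how this sketch captures topic correlations that a flat LDA model cannot.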

See also

  • Probabilistic latent semantic indexing (PLSI), an early topic model from Thomas Hofmann in 1999[5]
  • Latent Dirichlet allocation, a generalization of PLSI developed by David Blei, Andrew Ng, and Michael Jordan, allowing documents to have a mixture of topics[6]
  • MALLET, an open source Java library that implements Pachinko allocation

References

  1. ^ Blei, David. "Topic modeling". Archived from the original on 2 October 2012. Retrieved 4 October 2012.
  2. ^ a b Li, Wei; Blei, David; McCallum, Andrew (2007). "Nonparametric Bayes Pachinko Allocation". arXiv:1206.5270.
  3. ^ a b c Li, Wei; McCallum, Andrew (2006). "Pachinko allocation: DAG-structured mixture models of topic correlations" (PDF). Proceedings of the 23rd international conference on Machine learning - ICML '06. pp. 577–584. doi:10.1145/1143844.1143917. ISBN 1595933832. S2CID 13160178.
  4. ^ Mimno, David; Li, Wei; McCallum, Andrew (2007). "Mixtures of hierarchical topics with Pachinko allocation" (PDF). Proceedings of the 24th international conference on Machine learning. pp. 633–640. doi:10.1145/1273496.1273576. ISBN 9781595937933. S2CID 6045658.
  5. ^ Hofmann, Thomas (1999). "Probabilistic Latent Semantic Indexing" (PDF). Proceedings of the Twenty-Second Annual International SIGIR Conference on Research and Development in Information Retrieval. Archived from the original (PDF) on 14 December 2010.
  6. ^ Blei, David M.; Ng, Andrew Y.; Jordan, Michael I.; Lafferty, John (January 2003). "Latent Dirichlet allocation". Journal of Machine Learning Research. 3: 993–1022. Archived from the original on 1 May 2012. Retrieved 19 July 2010.

External links

  • Mixtures of Hierarchical Topics with Pachinko Allocation, a video recording of David Mimno presenting HPAM in 2007.
