Item Details


Title Probabilistic topic modeling in multilingual settings: An overview of its methodology and applications
Type Journal Article
Author Ivan Vulić
Author Wim De Smet
Author Jie Tang
Author Marie-Francine Moens
URL http://www.sciencedirect.com/science/article/pii/S0306457314000739
Volume 51
Issue 1
Pages 111-147
Publication Information Processing & Management
ISSN 0306-4573
Date January 1, 2015
Journal Abbr Information Processing & Management
DOI 10.1016/j.ipm.2014.08.003
Accessed 2017-11-28 00:03:50
Library Catalog ScienceDirect
Abstract Probabilistic topic models are unsupervised generative models which model document content as a two-step generation process, that is, documents are observed as mixtures of latent concepts or topics, while topics are probability distributions over vocabulary words. Recently, a significant research effort has been invested in transferring the probabilistic topic modeling concept from monolingual to multilingual settings. Novel topic models have been designed to work with parallel and comparable texts. We define multilingual probabilistic topic modeling (MuPTM) and present the first full overview of the current research, methodology, advantages and limitations in MuPTM. As a representative example, we choose a natural extension of the omnipresent LDA model to multilingual settings called bilingual LDA (BiLDA). We provide a thorough overview of this representative multilingual model from its high-level modeling assumptions down to its mathematical foundations. We demonstrate how to use the data representation by means of output sets of (i) per-topic word distributions and (ii) per-document topic distributions coming from a multilingual probabilistic topic model in various real-life cross-lingual tasks involving different languages, without any external language-pair-dependent translation resource: (1) cross-lingual event-centered news clustering, (2) cross-lingual document classification, (3) cross-lingual semantic similarity, and (4) cross-lingual information retrieval. We also briefly review several other applications present in the relevant literature, and introduce and illustrate two related modeling concepts: topic smoothing and topic pruning. In summary, this article encompasses the current research in multilingual probabilistic topic modeling. By presenting a series of potential applications, we reveal the importance of the language-independent and language-pair-independent data representations by means of MuPTM. We provide clear directions for future research in the field by providing a systematic overview of how to link and transfer aspect knowledge across corpora written in different languages via the shared space of latent cross-lingual topics, that is, how to effectively employ learned per-topic word distributions and per-document topic distributions of any multilingual probabilistic topic model in various cross-lingual applications.
Short Title Probabilistic topic modeling in multilingual settings
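
Notes

The abstract's central mechanism is that a multilingual topic model such as BiLDA represents every document, whatever its language, as a per-document distribution over one shared set of latent cross-lingual topics, so documents become directly comparable across languages with no translation resource. Below is a minimal illustrative sketch of that idea applied to cross-lingual semantic similarity (task 3 in the abstract); the function name, the topic count K = 4, and the example distributions are hypothetical assumptions, not taken from the paper.

# Minimal sketch (hypothetical, not the paper's implementation): comparing
# documents written in different languages via their per-document topic
# distributions (theta vectors) over K shared latent cross-lingual topics.
import numpy as np

def topic_similarity(theta_a: np.ndarray, theta_b: np.ndarray) -> float:
    """Cosine similarity between two per-document topic distributions.

    Both vectors live in the same language-independent topic space, so the
    comparison needs no dictionary or other translation resource.
    """
    return float(np.dot(theta_a, theta_b)
                 / (np.linalg.norm(theta_a) * np.linalg.norm(theta_b)))

# Hypothetical inferred distributions over K = 4 shared topics for an
# English and a Dutch document covering the same news event.
theta_en = np.array([0.70, 0.10, 0.15, 0.05])
theta_nl = np.array([0.65, 0.15, 0.10, 0.10])

print(f"cross-lingual similarity: {topic_similarity(theta_en, theta_nl):.3f}")

The same per-document topic vectors support the abstract's other applications: clustering them groups news events across languages, and ranking them against a query's topic vector yields cross-lingual information retrieval.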
