Learning Topics Using Semantic Locality

Ziyi Zhao, Krittaphat Pugdeethosapol, Sheng Lin, Zhe Li, Caiwen Ding, Yanzhi Wang, Qinru Qiu

Research output: Chapter in Book/Entry/Poem › Conference contribution

3 Scopus citations

Abstract

Topic modeling discovers the latent topic probabilities of given text documents. To generate more meaningful topics that better represent a given document, we propose a new feature extraction technique that can be used in the data preprocessing stage. The method consists of three steps. First, it generates words and word pairs from every single document. Second, it applies a two-way TF-IDF algorithm to the words and word pairs for semantic filtering. Third, it uses the K-means algorithm to merge word pairs that have similar semantic meanings. Experiments are carried out on the Open Movie Database (OMDb), the Reuters dataset, and the 20NewsGroup dataset, with the mean Average Precision (mAP) score as the evaluation metric. Compared with other state-of-the-art topic models, such as Latent Dirichlet Allocation and traditional Restricted Boltzmann Machines, our proposed data preprocessing improves the accuracy of the generated topics by up to 12.99%.
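
A minimal sketch of the three-step pipeline, in Python with scikit-learn. The abstract does not give implementation details, so every concrete choice below is an assumption rather than the authors' method: word pairs are taken as adjacent bigrams, the two-way TF-IDF filter is approximated by a single threshold on each term's maximum corpus TF-IDF weight, and each word pair is represented for K-means by its vector of per-document TF-IDF weights.

```python
# Sketch of the three-step preprocessing pipeline outlined in the abstract.
# Hypothetical choices (not specified by the paper): adjacent bigrams as
# word pairs, a max-TF-IDF threshold standing in for the two-way TF-IDF,
# and per-document TF-IDF weight vectors as the word-pair representation.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "the movie has a thrilling plot and great acting",
    "stock markets fell as trade talks stalled",
    "the film features strong acting and a weak plot",
]

# Step 1: generate words and word pairs (unigrams and bigrams) per document.
vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
tfidf = vectorizer.fit_transform(docs)          # shape: (n_docs, n_terms)
terms = vectorizer.get_feature_names_out()

# Step 2: TF-IDF based semantic filtering -- keep terms whose maximum
# TF-IDF weight over the corpus passes a threshold (a simplified stand-in
# for the paper's two-way TF-IDF, whose exact form the abstract omits).
max_weight = tfidf.max(axis=0).toarray().ravel()
keep = max_weight >= 0.3
kept_terms = terms[keep]
kept_matrix = tfidf[:, keep]

# Step 3: merge word pairs with similar meaning via K-means, clustering
# each pair's vector of per-document TF-IDF weights.
pair_mask = np.array([" " in t for t in kept_terms])
pair_vectors = kept_matrix[:, pair_mask].T.toarray()
if pair_vectors.shape[0] >= 2:
    k = min(3, pair_vectors.shape[0])
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pair_vectors)
    for cluster in range(k):
        merged = kept_terms[pair_mask][labels == cluster]
        print(f"cluster {cluster}: {list(merged)}")
```

Because each word pair is represented here by its distribution of weights over documents, pairs that co-occur in the same documents tend to fall into the same cluster; the paper's actual semantic representation and merging criterion may differ.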

Original language: English (US)
Title of host publication: 2018 24th International Conference on Pattern Recognition, ICPR 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3710-3715
Number of pages: 6
ISBN (Electronic): 9781538637883
DOIs
State: Published - Nov 26, 2018
Event: 24th International Conference on Pattern Recognition, ICPR 2018 - Beijing, China
Duration: Aug 20, 2018 - Aug 24, 2018

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 2018-August
ISSN (Print): 1051-4651

Other

Other: 24th International Conference on Pattern Recognition, ICPR 2018
Country/Territory: China
City: Beijing
Period: 8/20/18 - 8/24/18

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
