Learning Topics Using Semantic Locality

Ziyi Zhao, Krittaphat Pugdeethosapol, Sheng Lin, Zhe Li, Caiwen Ding, Yanzhi Wang, Qinru Qiu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Topic modeling discovers the latent topic probabilities of given text documents. To generate more meaningful topics that better represent the given documents, we propose a new feature extraction technique that can be used in the data preprocessing stage. The method consists of three steps. First, it generates the word/word-pair set from every single document. Second, it applies a two-way TF-IDF algorithm to the word/word-pair set for semantic filtering. Third, it uses the K-means algorithm to merge word pairs that have similar semantic meaning. Experiments are carried out on the Open Movie Database (OMDb), the Reuters dataset, and the 20NewsGroup dataset, with the mean Average Precision score as the evaluation metric. Compared with other state-of-the-art topic models, such as Latent Dirichlet Allocation and traditional Restricted Boltzmann Machines, our proposed data preprocessing improves the generated topic accuracy by up to 12.99%.
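The abstract outlines a three-step preprocessing pipeline: word/word-pair generation, two-way TF-IDF filtering, and K-means merging of semantically similar word pairs. The sketch below illustrates how such a pipeline could be wired up; the paper does not specify its exact implementation, so the toy corpus, the TF-IDF threshold, and the use of per-document TF-IDF vectors as clustering features are illustrative assumptions, not the authors' procedure.

```python
# Illustrative sketch of the three preprocessing steps described in the
# abstract. Corpus, threshold, and clustering features are assumptions.
from itertools import combinations

import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "the spaceship crew explores a distant planet",
    "a detective investigates a murder in the city",
    "the crew repairs the spaceship after the crash",
]

# Step 1: extract words and word pairs from every single document.
def words_and_pairs(doc):
    tokens = doc.lower().split()
    pairs = ["_".join(sorted(p)) for p in combinations(sorted(set(tokens)), 2)]
    return tokens + pairs

# Step 2: score the word/word-pair features with TF-IDF and keep only the
# features whose best score exceeds a threshold (semantic filtering).
vectorizer = TfidfVectorizer(analyzer=words_and_pairs)
tfidf = vectorizer.fit_transform(docs)                      # (n_docs, n_features)
features = vectorizer.get_feature_names_out()
scores = np.asarray(tfidf.max(axis=0).todense()).ravel()
kept = np.where(scores > 0.15)[0]                           # assumed threshold

# Step 3: cluster the surviving word pairs with K-means, using their
# per-document TF-IDF vectors as features; pairs in the same cluster are
# merged into a single feature for the downstream topic model.
pair_idx = [i for i in kept if "_" in features[i]]
pair_names = [features[i] for i in pair_idx]
X = np.asarray(tfidf[:, pair_idx].todense()).T              # one row per word pair
if len(pair_idx) >= 2:
    km = KMeans(n_clusters=min(3, len(pair_idx)), n_init=10, random_state=0).fit(X)
    for cluster in sorted(set(km.labels_)):
        merged = [p for p, c in zip(pair_names, km.labels_) if c == cluster]
        print(f"cluster {cluster}: merge {merged}")
```

Each cluster's word pairs would then be collapsed into one feature before training the topic model (e.g., an RBM-based model), which is where the reported accuracy gain over unprocessed input comes from.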

Original language: English (US)
Title of host publication: 2018 24th International Conference on Pattern Recognition, ICPR 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3710-3715
Number of pages: 6
Volume: 2018-August
ISBN (Electronic): 9781538637883
DOIs: 10.1109/ICPR.2018.8546223
State: Published - Nov 26 2018
Event: 24th International Conference on Pattern Recognition, ICPR 2018 - Beijing, China
Duration: Aug 20 2018 – Aug 24 2018

Other

Other: 24th International Conference on Pattern Recognition, ICPR 2018
Country: China
City: Beijing
Period: 8/20/18 – 8/24/18

Fingerprint

  • Semantics
  • Feature extraction
  • Experiments

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition

Cite this

Zhao, Z., Pugdeethosapol, K., Lin, S., Li, Z., Ding, C., Wang, Y., & Qiu, Q. (2018). Learning Topics Using Semantic Locality. In 2018 24th International Conference on Pattern Recognition, ICPR 2018 (Vol. 2018-August, pp. 3710-3715). [8546223] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICPR.2018.8546223
