A hybrid word embedding model based on admixture of Poisson-Gamma Latent Dirichlet Allocation model and distributed word-document-topic representation
Main Authors: , ,
Format: Article
Language: English
Published: Little Lion Scientific, 2020
Subjects:
Online Access: http://eprints.uthm.edu.my/6132/1/AJ%202020%20%28203%29.pdf
http://eprints.uthm.edu.my/6132/
Summary: This paper proposes a hybrid Poisson-Gamma Latent Dirichlet Allocation (PGLDA) model designed to model word dependencies and accommodate the semantic representation of words. The new model simultaneously overcomes the complexity shortcomings of using LDA as the baseline model and adequately captures the contextual correlation of words. The Poisson document-length distribution is replaced with an admixture of Poisson-Gamma to model word correlation when a hub word connects words and topics. Furthermore, the distributed representations of documents (Doc2Vec) and topics (Topic2Vec) are averaged to form new word-representation vectors, which are then combined with the topics with the largest likelihood from PGLDA. Model estimation is achieved by combining the Laplacian approximation of the PGLDA log-likelihood with the Feed-Forward Neural Network (FFN) approaches of Doc2Vec and Topic2Vec. The proposed hybrid method was evaluated on precision, recall, and F1 score using the 20 Newsgroups and AG's News datasets. Comparative analysis of F1 scores showed that the proposed hybrid model outperformed the other methods.
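The averaging step described in the summary can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the vectors, their dimensionality, and the values are hypothetical stand-ins for actual Doc2Vec and Topic2Vec embeddings.

```python
import numpy as np

# Hypothetical Doc2Vec embedding of a document and Topic2Vec embedding of a
# topic; in the paper these come from FFN-trained distributed representations.
doc_vec = np.array([0.2, 0.4, 0.6, 0.8])
topic_vec = np.array([0.6, 0.2, 0.4, 0.0])

# Element-wise average of the two vectors forms the new word-representation
# vector, which the paper then combines with the most likely PGLDA topics.
word_repr = (doc_vec + topic_vec) / 2.0
print(word_repr)  # [0.4 0.3 0.5 0.4]
```

In practice the two embeddings must share the same dimensionality for the element-wise average to be defined.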