
Knowledge-based topic models (KBTM)

Posted: 2015-03-03 18:37:32

http://blog.csdn.net/pipisorry/article/details/44040701

Terminology

A must-link states that two words should belong to the same topic.
A cannot-link states that two words should not belong to the same topic.
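These two constraint types can be represented and checked directly. A minimal sketch (the function and example words below are illustrative, not taken from any particular KBTM implementation):

```python
# Illustrative sketch: represent must-links and cannot-links as word pairs
# and check a word -> topic assignment against them.

def violated_constraints(topic_of, must_links, cannot_links):
    """Return the constraints broken by a word->topic assignment."""
    broken = []
    for w1, w2 in must_links:
        # Must-link: both words should land in the same topic.
        if topic_of[w1] != topic_of[w2]:
            broken.append(("must-link", w1, w2))
    for w1, w2 in cannot_links:
        # Cannot-link: the two words should be in different topics.
        if topic_of[w1] == topic_of[w2]:
            broken.append(("cannot-link", w1, w2))
    return broken

topic_of = {"price": 0, "cost": 0, "battery": 1, "amazon": 1}
must = [("price", "cost")]        # synonyms: same topic expected
cannot = [("battery", "amazon")]  # different aspects: separate topics expected
print(violated_constraints(topic_of, must, cannot))
# -> [('cannot-link', 'battery', 'amazon')]
```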

DF-LDA

DF-LDA is perhaps the earliest KBTM. It can incorporate two forms of prior knowledge from the user: must-links and cannot-links.

[Andrzejewski, David, Zhu, Xiaojin, and Craven, Mark. Incorporating domain knowledge into topic modeling via Dirichlet Forest priors. In ICML, pp. 25–32, 2009.]
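The intuition behind DF-LDA's Dirichlet Forest prior is that must-linked words share a subtree with a large internal concentration, so any draw of a topic's word distribution gives them correlated, similar probability mass. A two-level simplification of that idea (not the full DF-LDA prior; `beta`, `eta`, and the example words are assumed for illustration):

```python
import numpy as np

# Simplified two-level sketch of the Dirichlet Forest intuition:
# "price" and "cost" are must-linked, so they sit under a shared subtree
# whose large concentration eta forces a near-even split between them.
rng = np.random.default_rng(0)
beta, eta = 0.1, 100.0  # assumed values: sparse top level, tight subtree

top = rng.dirichlet([beta, beta])    # mass split: subtree vs. "battery"
inside = rng.dirichlet([eta, eta])   # split within the must-link subtree

phi = {
    "price": top[0] * inside[0],
    "cost": top[0] * inside[1],
    "battery": top[1],
}
# Because eta is large, phi["price"] and phi["cost"] are always close,
# while the top-level split (and hence "battery") can vary freely.
```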


DF-LDA [1]: A knowledge-based topic model that can use both must-links and cannot-links, but it assumes all the knowledge is correct.
MC-LDA [10]: A knowledge-based topic model that also uses both must-link and cannot-link knowledge. It likewise assumes that all knowledge is correct.
GK-LDA [9]: A knowledge-based topic model that uses the ratio of word probabilities under each topic to reduce the effect of wrong knowledge. However, it can only use the must-link type of knowledge.
LTM [7]: A lifelong learning topic model that automatically learns only the must-link type of knowledge. It outperformed [8].
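One common way such models bias inference toward must-link knowledge is a generalized Pólya urn: assigning a word to a topic also adds a fractional pseudo-count for each word must-linked with it. A hedged sketch of that counting step only (the function name and promotion factor are assumed, and real samplers embed this inside a full Gibbs update):

```python
from collections import defaultdict

# Sketch of a generalized Polya urn counting step: assigning word w to
# topic k also "promotes" words must-linked with w in topic k.
PROMOTION = 0.3  # assumed fractional pseudo-count for linked words

def assign(word, topic, counts, must_links):
    """Add word to topic's counts, promoting its must-linked partners."""
    counts[topic][word] += 1.0
    for w1, w2 in must_links:
        if word == w1:
            counts[topic][w2] += PROMOTION
        elif word == w2:
            counts[topic][w1] += PROMOTION

counts = defaultdict(lambda: defaultdict(float))
assign("price", 0, counts, [("price", "cost")])
print(counts[0]["price"], counts[0]["cost"])  # -> 1.0 0.3
```

The promotion makes must-linked words more likely to be sampled into the same topic in later iterations, which is how the knowledge steers the model without being a hard constraint.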

