Author Archives: LiqiangGuo
I am back
Like Hu Hansan, I have returned to WordPress, haha. Years ago, in days of burning enthusiasm for academia and research, I created this WordPress blog, which was then successfully walled off behind the GFW. These days I spend my time writing code for capitalists. On a whim, perhaps to satisfy my vanity, I searched my own name on Google Scholar, and so we meet again. It has been years; how have you been, WordPress! Five years have passed, and I am a different person now.
Posted in Uncategorized
A method using proximity for pseudo-relevance feedback
Selecting Good Expansion Terms for Pseudo-Relevance Feedback. But I think proximity is too simple; perhaps a more advanced approach should be studied, for example a topic model. PRF is useful in IR, but because of the existence of multiple topics in a document (some may … Continue reading
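The proximity idea above can be sketched in a few lines. This is a minimal heuristic of my own for illustration, not the scoring formula from the paper: candidate expansion terms are weighted by how close they occur to query terms in the feedback documents.

```python
from collections import defaultdict

def proximity_expansion_scores(query_terms, feedback_docs, window=10):
    """Score candidate expansion terms by their distance to the
    nearest query-term occurrence in the feedback documents
    (a simple proximity heuristic, not the paper's exact method)."""
    scores = defaultdict(float)
    qset = set(query_terms)
    for doc in feedback_docs:                        # doc: list of tokens
        qpos = [i for i, w in enumerate(doc) if w in qset]
        for i, w in enumerate(doc):
            if w in qset:
                continue
            # distance to the nearest query-term occurrence
            d = min((abs(i - p) for p in qpos), default=None)
            if d is not None and d <= window:
                scores[w] += 1.0 / d                 # closer => higher score
    return dict(scores)
```

For example, with the query term "feedback" and the feedback document ["pseudo", "relevance", "feedback", "expands", "query"], the adjacent words "relevance" and "expands" score 1.0 while the words two positions away score 0.5.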
Posted in Uncategorized
Latent Dirichlet Allocation
Latent Dirichlet Allocation Probabilistic Topic Models Latent Dirichlet Allocation2
Posted in Uncategorized
Probabilistic Latent Semantic Analysis
Probabilistic Latent Semantic Indexing Unsupervised Learning by Probabilistic Latent Semantic Analysis
Posted in Uncategorized
Latent Semantic Analysis
LSA is unable to explicitly capture multiple senses of a word, nor does it take into account that each word occurrence typically refers to only one meaning at a time. So LSA cannot address … Continue reading
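The limitation is easy to see in code. In the toy sketch below (my own example data), LSA is just a truncated SVD of the term-document matrix: every term, including the polysemous "bank", ends up with exactly one low-rank vector, so its finance sense and river sense are conflated into a single point.

```python
import numpy as np

# Toy term-document count matrix: rows = terms, columns = documents.
# d1, d2 are finance documents; d3, d4 are river documents.
terms = ["bank", "money", "loan", "river", "water"]
X = np.array([[2, 1, 1, 2],    # bank (occurs in BOTH senses)
              [2, 1, 0, 0],    # money
              [1, 2, 0, 0],    # loan
              [0, 0, 2, 1],    # river
              [0, 0, 1, 2]],   # water
             dtype=float)

# LSA: truncated SVD of the term-document matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]   # one k-dimensional vector per term

# Every occurrence of "bank" maps to this single vector;
# the two senses cannot be separated.
bank_vec = term_vecs[terms.index("bank")]
print(bank_vec.shape)          # (2,)
```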
Posted in Uncategorized
Discriminative Model vs. Generative Model
Generative Model and Discriminative Model
Posted in Uncategorized
SST and ST in Tree Kernel
Making tree kernels practical for natural language learning
Posted in Uncategorized
Macro Average Precision VS. Micro Average Precision
Note: when the counts of the returned document lists belonging to different queries are the same, Macro == Micro. Macro_Micro
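The note above can be checked directly. In this sketch (my own example numbers), macro-averaged precision is the mean of the per-query precisions, while micro-averaged precision pools relevant and retrieved counts over all queries; when every query returns the same number of documents the two coincide.

```python
def macro_micro_precision(results):
    """results: list of (num_relevant_retrieved, num_retrieved) per query.
    Returns (macro, micro) averaged precision."""
    # macro: average the per-query precisions
    macro = sum(r / n for r, n in results) / len(results)
    # micro: pool counts across all queries first
    micro = sum(r for r, _ in results) / sum(n for _, n in results)
    return macro, micro

# Equal-length result lists (10 docs per query): macro == micro.
ma, mi = macro_micro_precision([(3, 10), (7, 10)])
print(ma, mi)                  # 0.5 0.5

# Unequal lengths: the two averages diverge.
ma2, mi2 = macro_micro_precision([(3, 10), (7, 20)])
```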
Posted in Uncategorized
Four Methods to Estimate Query Language in IR
1. Relevance model
2. Divergence minimization model
3. Simple mixture model
4. Regularized mixture model
A Comparative Study of Methods for Estimating Query Language Models with Pseudo Feedback
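Of the four, the simple mixture model is the easiest to sketch. The sketch below is my own minimal EM implementation of the standard formulation, assuming each feedback-document word is drawn from lam * p(w|theta_F) + (1 - lam) * p(w|C); the variable names (collection_lm, lam) are mine, not from the paper.

```python
from collections import Counter

def simple_mixture_feedback(feedback_docs, collection_lm, lam=0.5, iters=30):
    """EM for the simple mixture model: feedback documents are assumed
    generated by lam * p(w|theta_F) + (1 - lam) * p(w|C).
    Returns the estimated feedback topic model p(w|theta_F)."""
    counts = Counter(w for doc in feedback_docs for w in doc)
    vocab = list(counts)
    # uniform initialization of the feedback model
    p_f = {w: 1.0 / len(vocab) for w in vocab}
    for _ in range(iters):
        # E-step: probability each occurrence of w came from theta_F
        t = {w: lam * p_f[w] / (lam * p_f[w] + (1 - lam) * collection_lm[w])
             for w in vocab}
        # M-step: re-estimate p(w|theta_F) from the expected counts
        z = sum(counts[w] * t[w] for w in vocab)
        p_f = {w: counts[w] * t[w] / z for w in vocab}
    return p_f

# Words common in the collection ("the") get discounted; words that are
# frequent in feedback docs but rare in the collection get boosted.
docs = [["the", "topic", "model"], ["the", "topic"]]
coll = {"the": 0.5, "topic": 0.25, "model": 0.25}
p_f = simple_mixture_feedback(docs, coll)
```

EM pushes the background-like mass onto p(w|C), so theta_F concentrates on the discriminative feedback terms; this is the main attraction of the model over plain maximum likelihood on the feedback documents.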
Posted in Uncategorized