I am back

Like Hu Hansan in the movies, I have returned to WordPress, haha.
Years ago, in days full of passion for academia and research, I created this WordPress blog, which was then successfully walled off behind the GFW. These days I spend my time writing code for capitalists. On a whim, perhaps to satisfy my vanity, I searched my own name on Google Scholar, and so we meet again. Long time no see, WordPress. How have you been? Five years have passed, and I am a different person now.

Posted in Uncategorized | Leave a comment

A method using proximity for pseudo-relevance feedback

Selecting Good Expansion Terms for Pseudo-Relevance Feedback.

But I think proximity is too simple; perhaps more advanced approaches should be studied, for example topic models.
PRF is useful in IR, but because a document may contain multiple topics (some of them noisy), the traditional methods are not sufficient.
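The basic idea can be sketched in a few lines. This is a toy illustration, not the paper's method: `expansion_terms` is a hypothetical helper that scores candidate expansion terms by their frequency in the top-k pseudo-relevant documents, plus a crude window-based proximity bonus for appearing near query terms.

```python
from collections import Counter

def expansion_terms(query, ranked_docs, k=3, n_terms=5):
    """Score candidate expansion terms from the top-k pseudo-relevant docs.

    A term's score is its frequency in the feedback set plus a bonus for
    each query term found within a +/-2 token window (a crude proximity
    signal). Documents are lists of tokens, ranked by the first pass.
    """
    qset = set(query)
    scores = Counter()
    for doc in ranked_docs[:k]:              # pseudo-relevance assumption
        for i, term in enumerate(doc):
            if term in qset:
                continue
            window = doc[max(0, i - 2): i + 3]
            prox = sum(1 for w in window if w in qset)
            scores[term] += 1 + prox
    return [t for t, _ in scores.most_common(n_terms)]
```

A topic-model variant would replace the frequency/proximity score with, say, the term's probability under the dominant topics of the feedback documents.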


Latent Dirichlet Allocation


Latent Dirichlet Allocation
Probabilistic Topic Models
Latent Dirichlet Allocation2
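LDA's generative story can be sketched with the standard library alone: draw per-document topic proportions from a Dirichlet (by normalising Gamma draws), then draw a topic and a word for each position. This is a minimal sketch of the generative process only, not inference; `topic_word` and the function names are invented for illustration.

```python
import random

def dirichlet(alpha):
    """Sample from a Dirichlet by normalising independent Gamma draws."""
    draws = [random.gammavariate(a, 1.0) for a in alpha]
    total = sum(draws)
    return [d / total for d in draws]

def generate_document(n_words, topic_word, alpha):
    """LDA's generative story for one document:
    1. draw topic proportions theta ~ Dirichlet(alpha)
    2. for each position: draw a topic z ~ theta, then a word w ~ topic_word[z]
    """
    theta = dirichlet(alpha)
    doc = []
    for _ in range(n_words):
        z = random.choices(range(len(theta)), weights=theta)[0]
        words, probs = zip(*topic_word[z].items())
        doc.append(random.choices(words, weights=probs)[0])
    return doc
```

Inference (Gibbs sampling or variational EM) inverts this story to recover theta and topic_word from observed documents.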


Probabilistic Latent Semantic Analysis


Probabilistic Latent Semantic Indexing
Unsupervised Learning by Probabilistic Latent Semantic Analysis
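pLSA factorises P(w|d) = Σ_z P(z|d) P(w|z) and fits the factors with EM. A minimal pure-Python sketch (the function and variable names are my own, and numerical details such as smoothing are simplified):

```python
import random

def plsa(counts, n_topics, n_iter=50, seed=0):
    """EM for pLSA: factorise P(w|d) = sum_z P(z|d) * P(w|z).

    counts: list of dicts, counts[d][w] = n(d, w).
    Returns (p_z_d, p_w_z) after n_iter EM iterations.
    """
    rng = random.Random(seed)
    docs = range(len(counts))
    vocab = sorted({w for c in counts for w in c})

    def normalise(xs):
        s = sum(xs)
        return [x / s for x in xs]

    # Random initialisation of both factors.
    p_z_d = [normalise([rng.random() for _ in range(n_topics)]) for _ in docs]
    p_w_z = [dict(zip(vocab, normalise([rng.random() for _ in vocab])))
             for _ in range(n_topics)]

    for _ in range(n_iter):
        new_zd = [[1e-12] * n_topics for _ in docs]
        new_wz = [{w: 1e-12 for w in vocab} for _ in range(n_topics)]
        for d in docs:
            for w, n in counts[d].items():
                # E-step: posterior over topics for this (d, w) pair.
                post = normalise([p_z_d[d][z] * p_w_z[z][w]
                                  for z in range(n_topics)])
                # M-step: accumulate expected counts.
                for z in range(n_topics):
                    new_zd[d][z] += n * post[z]
                    new_wz[z][w] += n * post[z]
        p_z_d = [normalise(row) for row in new_zd]
        p_w_z = [dict(zip(vocab, normalise([acc[w] for w in vocab])))
                 for acc in new_wz]
    return p_z_d, p_w_z
```

Unlike LDA, p_z_d is a free parameter per training document, which is why pLSA has no clean generative story for unseen documents.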


Latent Semantic Analysis


LSA is unable to explicitly capture multiple senses of a word, nor does it account for the fact that each word occurrence typically refers to only one meaning at a time. So LSA cannot address the polysemy problem, but the synonymy problem can be addressed by this model.
LSA
LSA2
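The synonymy point shows up in a small worked example: after a truncated SVD of a term-document matrix, "car" and "auto" end up with similar latent vectors even though they never co-occur in the same document. The toy matrix and term list below are invented for illustration.

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
terms = ["car", "auto", "truck", "flower", "petal"]
X = np.array([
    [2, 0, 1, 0],   # car
    [0, 2, 1, 0],   # auto
    [1, 1, 0, 0],   # truck
    [0, 0, 0, 2],   # flower
    [0, 0, 1, 2],   # petal
], dtype=float)

# Truncated SVD: keep k latent dimensions.
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
term_vecs = U[:, :k] * s[:k]   # term representations in latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "car" and "auto" never share a document, yet their latent vectors are
# close because they share contexts via "truck"; "flower" stays far away.
sim_car_auto = cosine(term_vecs[0], term_vecs[1])
sim_car_flower = cosine(term_vecs[0], term_vecs[3])
```

Note each term still gets exactly one vector, which is why polysemy (one word, several senses) stays out of reach.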


Discriminative Model vs. Generative Model

Generative Model and Discriminative Model
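The contrast is concrete in code: a generative classifier (Naive Bayes here) estimates the joint P(x, y) by counting and classifies via Bayes' rule, while a discriminative one (logistic regression) fits P(y|x) directly by gradient descent. The toy dataset and helper names below are invented; this is a sketch, not a library implementation.

```python
import math

# Toy data: each example is ([f1, f2], label), features are binary.
data = [([1, 0], 0), ([1, 1], 0), ([1, 0], 0),
        ([0, 1], 1), ([0, 1], 1), ([1, 1], 1)]

# --- Generative: Naive Bayes models P(x, y) = P(y) * prod_i P(x_i | y).
def train_nb(data, smooth=1.0):
    n = len(data)
    stats = {}
    for y in (0, 1):
        xs = [x for x, lab in data if lab == y]
        prior = len(xs) / n
        # Bernoulli P(x_i = 1 | y) with add-one smoothing.
        cond = [(sum(x[i] for x in xs) + smooth) / (len(xs) + 2 * smooth)
                for i in range(2)]
        stats[y] = (prior, cond)
    return stats

def predict_nb(stats, x):
    scores = {}
    for y, (prior, cond) in stats.items():
        logp = math.log(prior)
        for i, xi in enumerate(x):
            logp += math.log(cond[i] if xi else 1 - cond[i])
        scores[y] = logp
    return max(scores, key=scores.get)

# --- Discriminative: logistic regression models P(y | x) directly.
def train_lr(data, lr=0.5, epochs=200):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1 / (1 + math.exp(-z))
            g = p - y                      # gradient of the log-loss
            b -= lr * g
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
    return w, b

def predict_lr(model, x):
    w, b = model
    return int(b + sum(wi * xi for wi, xi in zip(w, x)) > 0)
```

Both predict the same labels on this toy set; the difference is what gets modelled, which matters for missing data, semi-supervision, and asymptotic accuracy.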


SST and ST in Tree Kernel

Making tree kernels practical for natural language learning
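Moschitti's paper gives one recursion covering both kernels: Δ(n1, n2) = 0 if the productions differ, λ at matching pre-terminals, and otherwise λ·∏_j (σ + Δ(child_j(n1), child_j(n2))), where σ = 0 yields the SubTree (ST) kernel and σ = 1 the SubSet Tree (SST) kernel. A small sketch of that formulation (the tuple tree encoding and helper names are my own):

```python
def nodes(t):
    """All internal nodes of a tree given as (label, child, ...) tuples."""
    if isinstance(t, str):
        return []
    result = [t]
    for c in t[1:]:
        result.extend(nodes(c))
    return result

def label(t):
    return t if isinstance(t, str) else t[0]

def production(t):
    """The grammar production at a node: (label, child labels)."""
    return (t[0], tuple(label(c) for c in t[1:]))

def delta(n1, n2, sigma, lam):
    if production(n1) != production(n2):
        return 0.0
    if all(isinstance(c, str) for c in n1[1:]):   # pre-terminal
        return lam
    prod = lam
    for c1, c2 in zip(n1[1:], n2[1:]):
        prod *= sigma + delta(c1, c2, sigma, lam)
    return prod

def tree_kernel(t1, t2, sigma, lam=1.0):
    """sigma=0 -> SubTree (ST) kernel; sigma=1 -> SubSet Tree (SST) kernel."""
    return sum(delta(n1, n2, sigma, lam)
               for n1 in nodes(t1) for n2 in nodes(t2))

# Two parse trees differing only in the noun.
t1 = ("S", ("NP", ("D", "the"), ("N", "cat")), ("VP", ("V", "sleeps")))
t2 = ("S", ("NP", ("D", "the"), ("N", "dog")), ("VP", ("V", "sleeps")))
```

SST counts many more shared fragments than ST on these two trees, because subset trees may cut below any non-terminal rather than keeping whole subtrees.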
