Monthly Archives: April 2011

Zipf’s law

Zipf’s Law
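
Since the post consists only of the law's name, here is a one-line statement in its standard form (my addition, not from the post): the frequency of the r-th most frequent word is roughly inversely proportional to its rank,

\[
f(r) \;\propto\; \frac{1}{r^{s}}, \qquad s \approx 1 \text{ for natural-language word frequencies.}
\]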


Four Methods to Estimate Query Language Models in IR

1. Relevance model
2. Divergence minimization model
3. Simple mixture model
4. Regularized mixture model

Reference: A Comparative Study of Methods for Estimating Query Language Models with Pseudo Feedback
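
As an illustration of the third method, here is a minimal sketch of the simple mixture model, assuming the feedback documents are bags of words and a smoothed collection language model p(w|C) is available: every word in a feedback document is generated either from an unknown feedback model theta_F (with probability lam) or from the collection model, theta_F is estimated by EM, and the result is interpolated with the original query model. The function names and the parameters lam and alpha are illustrative, not taken from the paper.

from collections import Counter

def estimate_feedback_model(feedback_docs, collection_probs, lam=0.5, iters=30):
    """EM for the simple mixture model: each word in a feedback document is
    generated from the unknown feedback model theta_F with probability lam,
    and from the fixed collection model p(w|C) with probability 1 - lam."""
    counts = Counter()
    for doc in feedback_docs:              # each doc is a list of tokens
        counts.update(doc)
    vocab = list(counts)
    theta_f = {w: 1.0 / len(vocab) for w in vocab}   # uniform initialization
    for _ in range(iters):
        expected = {}
        for w in vocab:
            p_f = lam * theta_f[w]
            p_c = (1.0 - lam) * collection_probs.get(w, 1e-9)
            t = p_f / (p_f + p_c)          # E-step: P(w was generated by theta_F)
            expected[w] = counts[w] * t    # expected count attributed to theta_F
        total = sum(expected.values())
        theta_f = {w: c / total for w, c in expected.items()}  # M-step
    return theta_f

def interpolate(query_model, feedback_model, alpha=0.5):
    """Interpolate the original query model with the estimated feedback model."""
    words = set(query_model) | set(feedback_model)
    return {w: (1 - alpha) * query_model.get(w, 0.0)
               + alpha * feedback_model.get(w, 0.0)
            for w in words}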


MEMM vs. CRF

The advantage of the CRF is that it resolves the label bias problem, which can occur in the MEMM, by using global rather than per-state normalization.

Reference: Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
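
In symbols (a standard formulation, not quoted from the paper): the MEMM normalizes each transition distribution locally, while the CRF normalizes once over whole label sequences,

\[
\text{MEMM:}\quad p(\mathbf{y}\mid\mathbf{x}) = \prod_{t} \frac{\exp\big(\mathbf{w}\cdot \mathbf{f}(y_{t-1}, y_t, \mathbf{x}, t)\big)}{Z(y_{t-1}, \mathbf{x}, t)}
\qquad
\text{CRF:}\quad p(\mathbf{y}\mid\mathbf{x}) = \frac{1}{Z(\mathbf{x})}\exp\Big(\sum_{t} \mathbf{w}\cdot \mathbf{f}(y_{t-1}, y_t, \mathbf{x}, t)\Big)
\]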


The Label Bias Problem

“This per-state normalization of transition scores implies a ‘conservation of score mass’ (Bottou, 1991) whereby all the mass that arrives at a state must be distributed among the possible successor states. An observation can affect which destination states get the …”
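
A worked special case (mine, not from the quoted paper) makes the problem concrete: if a state s has only one outgoing transition, per-state normalization forces that transition to absorb all of the score mass, so the observation is ignored entirely,

\[
p(s' \mid s, x_t) \;=\; \frac{\exp\big(\mathbf{w}\cdot \mathbf{f}(s, s', x_t)\big)}{\sum_{s''} \exp\big(\mathbf{w}\cdot \mathbf{f}(s, s'', x_t)\big)} \;=\; 1
\quad\text{whenever } s' \text{ is the only successor of } s,\ \text{for every observation } x_t .
\]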


HMM vs. MEMM

The HMM is a useful model with a long history that has been used in many domains. The MEMM is a newer model inspired by the HMM and by maximum entropy theory. It is more flexible than the HMM: it can incorporate many features …
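
For contrast (standard textbook formulations, not taken from the linked post): the HMM is a generative model of observations and labels jointly, while the MEMM conditions on the observations directly, which is what lets it use rich, overlapping observation features,

\[
\text{HMM:}\quad p(\mathbf{x}, \mathbf{y}) = \prod_{t} p(y_t \mid y_{t-1})\, p(x_t \mid y_t)
\qquad
\text{MEMM:}\quad p(\mathbf{y} \mid \mathbf{x}) = \prod_{t} p(y_t \mid y_{t-1}, x_t)
\]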


The Maximum Entropy Model

A useful web site: http://homepages.inf.ed.ac.uk/lzhang10/maxent.html

References: A Maximum Entropy Approach to Natural Language Processing; Lagrangian duality and algorithms for the Lagrangian dual problem
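
The conditional maximum entropy model has the standard exponential form, which the referenced paper derives via Lagrangian duality (the notation below is the usual one, not copied from the post):

\[
p_{\lambda}(y \mid x) = \frac{1}{Z_{\lambda}(x)} \exp\Big(\sum_{i} \lambda_i f_i(x, y)\Big),
\qquad
Z_{\lambda}(x) = \sum_{y'} \exp\Big(\sum_{i} \lambda_i f_i(x, y')\Big)
\]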


The Basic Model of EM Algorithm

The EM algorithm is a general method for maximum-likelihood estimation when the data are “incomplete” or the likelihood function involves latent variables.

References: The Basic Model of EM Algorithm; A Note on the Expectation-Maximization (EM) Algorithm
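
In the standard formulation (not quoted from the referenced notes), each iteration alternates an expectation step over the latent variables and a maximization step over the parameters:

\[
Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{Z \mid X, \theta^{(t)}}\big[\log p(X, Z \mid \theta)\big],
\qquad
\theta^{(t+1)} = \arg\max_{\theta}\, Q(\theta \mid \theta^{(t)})
\]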
