The document summarizes two papers on language modeling techniques. [Zhang+ ACL2014] proposes applying Kneser-Ney smoothing to expected counts when the training data carries fractional weights, outperforming other smoothing methods on a domain adaptation task. [Pickhardt+ ACL2014] presents a generalized language model that combines skipped n-grams with modified Kneser-Ney smoothing, reducing perplexity by 25.7% on small training datasets.
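
For context, both papers extend Kneser-Ney smoothing, so a minimal sketch of the baseline may help. The sketch below implements an interpolated Kneser-Ney bigram model plus a small skipped-n-gram extractor; the function names, the corpus format (a list of token lists), the fixed discount of 0.75, and the window-based skip definition are illustrative assumptions, not details from either paper (in particular, modified Kneser-Ney uses count-dependent discounts, and the fractional-count extension of [Zhang+ ACL2014] is not shown).

```python
from collections import defaultdict
from itertools import combinations

def kneser_ney_bigram(corpus, discount=0.75):
    """Interpolated Kneser-Ney for bigrams; `corpus` is a list of token lists."""
    bigrams = defaultdict(int)          # c(prev, word)
    context_totals = defaultdict(int)   # c(prev, *)
    preceders = defaultdict(set)        # distinct contexts seen before each word
    followers = defaultdict(set)        # distinct words seen after each context

    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        for prev, word in zip(tokens, tokens[1:]):
            bigrams[(prev, word)] += 1
            context_totals[prev] += 1
            preceders[word].add(prev)
            followers[prev].add(word)

    bigram_types = sum(len(s) for s in preceders.values())

    def prob(word, prev):
        # Continuation probability: share of bigram types that end in `word`
        p_cont = len(preceders[word]) / bigram_types
        total = context_totals[prev]
        if total == 0:
            return p_cont  # unseen context: back off entirely
        discounted = max(bigrams[(prev, word)] - discount, 0) / total
        # Probability mass freed by discounting, redistributed via p_cont
        lam = discount * len(followers[prev]) / total
        return discounted + lam * p_cont

    return prob

def skipped_ngrams(tokens, n, window):
    """Yield in-order n-word subsequences spanning fewer than `window`
    positions -- one common formulation of skipped n-grams."""
    for idxs in combinations(range(len(tokens)), n):
        if idxs[-1] - idxs[0] < window:
            yield tuple(tokens[i] for i in idxs)
```

A quick usage example on a toy corpus:

```python
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
p = kneser_ney_bigram(corpus)
p("sat", "cat")   # seen bigram: discounted count plus interpolation
p("dog", "cat")   # unseen bigram: all weight falls on the continuation term
list(skipped_ngrams("a b c d".split(), 2, 3))  # includes the gapped pair ('a', 'c')
```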
