
Point-wise Mutual Information

Posted: 2019-08-24



(Yao et al., 2019) gave a clear description of point-wise mutual information (PMI):
\[ PMI(i, j) = \log \frac{p(i,j)}{p(i)\,p(j)}, \qquad p(i, j) = \frac{\#(i,j)}{\#W}, \qquad p(i) = \frac{\#(i)}{\#W} \]
where \(\#(i)\) is the number of sliding windows in the corpus that contain word \(i\), \(\#(i,j)\) is the number of sliding windows that contain both words \(i\) and \(j\), and \(\#W\) is the total number of sliding windows in the corpus.
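The window counts above are straightforward to compute. Below is a minimal sketch for a toy tokenized corpus; the function names, the sorted-pair keying, and the toy inputs are illustrative choices, not from either paper:

```python
import math
from collections import Counter
from itertools import combinations

def pmi_counts(tokens, window_size):
    """Count, over all sliding windows, #(i), #(i,j), and #W."""
    windows = [tokens[k:k + window_size]
               for k in range(max(1, len(tokens) - window_size + 1))]
    word_windows = Counter()   # #(i): windows containing word i
    pair_windows = Counter()   # #(i,j): windows containing both i and j
    for w in windows:
        uniq = set(w)          # a window counts each word at most once
        word_windows.update(uniq)
        for i, j in combinations(sorted(uniq), 2):
            pair_windows[(i, j)] += 1
    return word_windows, pair_windows, len(windows)

def pmi(i, j, word_windows, pair_windows, n_windows):
    """PMI(i, j) = log( #(i,j) * #W / (#(i) * #(j)) )."""
    key = (i, j) if i <= j else (j, i)
    co = pair_windows[key]
    if co == 0:
        return float("-inf")   # pair never co-occurs in any window
    return math.log(co * n_windows / (word_windows[i] * word_windows[j]))
```

For example, `pmi_counts("the cat sat on the mat".split(), 3)` yields the counts needed to score any pair of those words.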

(Levy et al., 2014) wrote the PMI formula equivalently as:
\[ PMI(i,j) = \log\frac{\#(i,j)\#W}{\#(i)\#(j)} \]

\(\#W\) is a constant once the sliding-window size and the corpus are fixed, so \(\log \#W\) is an additive constant. Dropping it gives a further simplification:
\[ PMI(i, j) - \log \#W = \log\frac{\#(i,j)}{\#(i)\#(j)} \]
Note that this shifts every score by the same amount: relative comparisons between word pairs are unchanged, but absolute values are not, and neither is the sign threshold \(PMI(i,j) > 0\) that (Yao et al., 2019) use to decide which word–word edges to keep.

References

Liang Yao et al., 2019. Graph Convolutional Networks for Text Classification. AAAI.

Omer Levy et al., 2014. Neural Word Embedding as Implicit Matrix Factorization. NIPS.


Original post: https://www.cnblogs.com/fengyubo/p/11404515.html
