Difference between Cross Entropy and Joint Entropy

Even though they are often written with the same notation H(p,q), they have different meanings: cross entropy compares two distributions of the same variable, while joint entropy describes a single distribution over a pair of variables.


Cross Entropy:
p and q are two distributions over the same variable x. Cross entropy measures, on average, how surprising samples drawn from p look when scored under q.

H(p,q) = - Sum_x( p(x) * log(q(x)) )
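
A minimal Python sketch of this formula, assuming the distributions are stored as plain dicts (the names p, q, and cross_entropy are just for illustration):

    import math

    def cross_entropy(p, q):
        # H(p,q) = -sum_x p(x) * log(q(x))
        # p and q map each outcome x of the SAME variable
        # to its probability under each distribution.
        return -sum(px * math.log(q[x]) for x, px in p.items() if px > 0)

    # Hypothetical example: two distributions over the same outcomes
    p = {"a": 0.5, "b": 0.5}
    q = {"a": 0.9, "b": 0.1}
    print(cross_entropy(p, q))  # ~1.20 nats, larger than H(p) = log 2 ~ 0.69

Note that the sum runs over a single variable x; the result grows as q drifts away from p.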


Joint Entropy:
x ~ p, y ~ q, and the pair (x,y) ~ f, where f is the joint distribution. Joint entropy measures the total uncertainty of the pair (x,y).

H(p,q) = - Sum_{x,y}( f(x,y) * log(f(x,y)) )

(Joint entropy is more commonly written H(x,y), since it depends on the joint distribution f and not on the marginals p and q alone; writing it as H(p,q) is exactly what invites confusion with cross entropy.)
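
A matching sketch under the same assumptions, with a hypothetical joint distribution f stored as a dict over pairs:

    import math

    def joint_entropy(f):
        # H(x,y) = -sum_{x,y} f(x,y) * log(f(x,y))
        # f maps each pair (x, y) to its joint probability.
        return -sum(pxy * math.log(pxy) for pxy in f.values() if pxy > 0)

    # Hypothetical example: a joint distribution over pairs (x, y)
    f = {("a", 0): 0.4, ("a", 1): 0.1,
         ("b", 0): 0.2, ("b", 1): 0.3}
    print(joint_entropy(f))  # total uncertainty of the pair (x, y)

Here the sum runs over pairs (x,y), and only the single joint distribution f appears in the formula, unlike cross entropy, where two distributions are compared.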

