Difference between Cross Entropy and Joint Entropy
Although both are often written with the same notation H(p, q), the two quantities have different meanings.
Cross Entropy:
p and q are two distributions over the same sample space, x ~ p
H(p,q) = - Sum_x( p(x) * log(q(x)) )
Note that a single variable x runs through both distributions; it measures how well q models samples drawn from p.
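A minimal Python sketch of this definition (the function name and the dict representation of distributions are my own, not from the original post):

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_x p(x) * log(q(x)).

    p and q are dicts mapping outcomes to probabilities over the
    SAME sample space -- the key point: one sum over one variable x.
    Terms with p(x) == 0 contribute nothing and are skipped.
    """
    return -sum(px * math.log(q[x]) for x, px in p.items() if px > 0)

p = {"a": 0.5, "b": 0.5}
q = {"a": 0.9, "b": 0.1}

print(cross_entropy(p, p))  # equals the entropy H(p) = ln 2 ≈ 0.6931
print(cross_entropy(p, q))  # ≈ 1.2040, larger than H(p) since q != p
```

H(p, q) ≥ H(p) always, with equality exactly when q = p; the gap is the KL divergence, which is why cross entropy is used as a loss function.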
Joint Entropy:
x ~ p, y ~ q, (x,y) ~ f, where f is the joint distribution whose marginals are p and q
H(p,q) = - Sum_{x,y}( f(x,y) * log(f(x,y)) )
Here the sum runs over pairs (x, y), and only the joint distribution f appears inside it; p and q enter only as marginals.
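The same kind of sketch for joint entropy (again, the function name and the example joint distribution f are illustrative choices of mine):

```python
import math

def joint_entropy(f):
    """Joint entropy H(X, Y) = -sum_{x,y} f(x,y) * log(f(x,y)).

    f is a dict mapping (x, y) pairs to their joint probability.
    Unlike cross entropy, only the joint distribution f appears
    in the formula -- never two different distributions.
    """
    return -sum(pxy * math.log(pxy) for pxy in f.values() if pxy > 0)

# A hypothetical joint distribution where X and Y are correlated.
f = {("a", "a"): 0.4, ("a", "b"): 0.1,
     ("b", "a"): 0.1, ("b", "b"): 0.4}

print(joint_entropy(f))  # ≈ 1.1935
```

A useful sanity check: when X and Y are independent, f(x,y) = p(x)·q(y) and the joint entropy splits as H(X, Y) = H(p) + H(q), whereas the correlated f above has strictly less than H(p) + H(q) = 2·ln 2.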