Shannon entropy
Shannon entropy is one of the most important metrics in information theory. It measures the uncertainty associated with a random variable, i.e. the expected amount of information in a message (in classical information theory this is measured in bits).
The concept was introduced by Claude E. Shannon in the paper "A Mathematical Theory of Communication" (1948). Shannon entropy allows one to estimate the average minimum number of bits needed to encode a string of symbols, given the alphabet size and the frequency of the symbols.
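As a minimal sketch of that estimate (the function name and interface here are illustrative, not from the original post), the average minimum bits per symbol for a string can be computed from its empirical symbol frequencies:

```python
import math
from collections import Counter

def string_entropy(s):
    """Estimate the average minimum number of bits per symbol
    needed to encode the string s, based on the empirical
    frequency of each symbol."""
    counts = Counter(s)
    n = len(s)
    # H = -sum over symbols of p * log2(p)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two symbols, each with frequency 1/2 -> 1 bit per symbol
print(string_entropy("aabb"))
# A single repeated symbol carries no information -> 0 bits
print(string_entropy("aaaa"))
```

Multiplying this per-symbol value by the string length gives the estimated minimum total size of an optimal encoding.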
The Shannon entropy is calculated using the formula:

H(X) = −∑ᵢ p(xᵢ) · log_b p(xᵢ)

where p(xᵢ) is the probability of the i-th outcome and b is the base of the logarithm.
When b = 2, H(X) gives the number of bits needed to represent the variable, and the bit is the unit of H(X). For example, if a variable represents a coin flip where heads and tails each have probability 1/2, then H(X) = 1, so a single bit suffices to represent the variable.
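The coin-flip case above can be checked directly with a small sketch (the helper name and signature are assumptions for illustration):

```python
import math

def entropy(probs, b=2):
    """Shannon entropy H(X) = -sum p_i * log_b(p_i).
    Zero-probability outcomes are skipped, since p * log(p) -> 0
    as p -> 0."""
    return -sum(p * math.log(p, b) for p in probs if p > 0)

# Fair coin: heads/tails each 1/2 -> exactly 1 bit of uncertainty
print(entropy([0.5, 0.5]))
# A certain outcome carries no uncertainty -> 0 bits
print(entropy([1.0]))
```

Changing b changes only the unit: b = e gives nats and b = 10 gives hartleys, while b = 2 gives the familiar bits.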