ZOJ 3827 — simple math derivation + simple simulation
Source: Internet · Editor: 程序博客网 · Time: 2024/06/16 01:06
Problem:
Information Theory is one of the most popular courses in Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream. Entropy thus characterizes our uncertainty about our source of information. The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". We also call it Shannon entropy or information entropy to distinguish from other occurrences of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek capital letter Eta) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:

H(X) = E[-log_b(P(X))]

Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as

H(X) = -∑_{i=1}^{n} P(xi) log_b(P(xi))

Where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10 respectively.

In the case of P(xi) = 0 for some i, the value of the corresponding summand 0 log_b(0) is taken to be a well-known limit:

0 log_b(0) = lim_{p→0+} p log_b(p) = 0
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:
The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
In the next line, there are N non-negative integers P1, P2, .., PN. Pi means the probability of the i-th value in percentage and the sum of Pi will be 100.
Output
For each test case, output the entropy in the corresponding unit.
Any solution with a relative or absolute error of at most 10^-8 will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000
Actually, the only formula we really need is:

H(X) = -∑_{i=1}^{n} P(xi) log_b(P(xi))

As for the 0 log_b(0) term: by definition 0 log_b(0) = lim_{p→0+} p log_b(p). Rewrite p log_b(p) as log_b(p) / (1/p), so that as p → 0+ both numerator and denominator tend to infinity. Applying L'Hôpital's rule (drop the limit sign and differentiate numerator and denominator separately) leaves -p / ln(b), whose limit is 0.
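Written out in full, the L'Hôpital step described above looks like this (LaTeX):

```latex
0 \log_b 0
  := \lim_{p \to 0^+} p \log_b p
   = \lim_{p \to 0^+} \frac{\log_b p}{1/p}          % rewrite as an infty/infty form
   = \lim_{p \to 0^+} \frac{1/(p \ln b)}{-1/p^2}    % L'Hopital: differentiate top and bottom
   = \lim_{p \to 0^+} \frac{-p}{\ln b}
   = 0
```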
Code:
#include <cstdio>
#include <algorithm>
#include <cmath>
#include <iostream>
#include <cstring>
#include <iomanip>
#include <bits/stdc++.h>  // ZOJ accepts this header; remember to submit as C++, not C
using namespace std;
typedef long long ll;
const int mod = 1000000007;
const int maxn = 120;
const double e1 = 2.0;             // base for "bit"
const double e2 = 2.718281828459;  // base for "nat" (Euler's number)
const double e3 = 10.0;            // base for "dit"
double lo;
int t, n;
double a[maxn];
string s;

int main() {
    ios_base::sync_with_stdio(false);
    cin >> t;
    while (t--) {
        cin >> n >> s;
        if (s == "bit") lo = e1;
        else if (s == "nat") lo = e2;
        else if (s == "dit") lo = e3;
        double ans = 0.0;
        for (int i = 1; i <= n; ++i) {
            cin >> a[i];
            if (a[i] == 0) continue;  // 0 * log(0) contributes 0 (the limit above)
            a[i] /= 100.0;            // percentage -> probability
            ans += a[i] * log(a[i]) / log(lo);  // change of base to b = lo
        }
        cout << fixed << setprecision(12) << -ans << endl;
    }
    return 0;
}