

(CodeForces - 609C) Load Balancing

time limit per test: 2 seconds
memory limit per test: 256 megabytes
input: standard input
output: standard output

In the school computer room there are n servers which are responsible for processing several computing tasks. You know the number of scheduled tasks for each server: there are mi tasks assigned to the i-th server.

In order to balance the load for each server, you want to reassign some tasks to make the difference between the most loaded server and the least loaded server as small as possible. In other words you want to minimize expression ma - mb, where a is the most loaded server and b is the least loaded one.

In one second you can reassign a single task. Thus in one second you can choose any pair of servers and move a single task from one server to another.

Write a program to find the minimum number of seconds needed to balance the load of servers.

Input

The first line contains a positive integer n (1 ≤ n ≤ 10^5) — the number of servers.

The second line contains a sequence of non-negative integers m1, m2, …, mn (0 ≤ mi ≤ 2·10^4), where mi is the number of tasks assigned to the i-th server.

Output

Print the minimum number of seconds required to balance the load.

Examples

Input

2
1 6

Output

2

Input

7
10 11 10 11 10 11 11

Output

0

Input

5
1 2 3 4 5

Output

3

Note

In the first example two seconds are needed. In each second, a single task from server #2 should be moved to server #1. After two seconds there should be 3 tasks on server #1 and 4 tasks on server #2.

In the second example the load is already balanced.

A possible sequence of task movements for the third example is:
1. move a task from server #4 to server #1 (the sequence m becomes: 2 2 3 3 5);
2. then move a task from server #5 to server #1 (the sequence m becomes: 3 2 3 3 4);
3. then move a task from server #5 to server #2 (the sequence m becomes: 3 3 3 3 3).

The above sequence is one of several possible ways to balance the load of servers in three seconds.

Summary: Given n numbers, each operation subtracts 1 from any one number and adds 1 to another. We want the difference between the maximum and the minimum of the final numbers to be as small as possible; output the minimum number of operations needed.

Approach: Let sum be the total of the n numbers. The integer average is ave = sum/n and the remainder is rem = sum%n. In any optimal final state, the numbers must be exactly n-rem copies of ave and rem copies of ave+1. So count, for each number, how many tasks it must receive (if below ave) or give up (if above ave+1); since each move removes one unit of surplus and one unit of deficit at the same time, the answer is the larger of the two totals.

#include <cstdio>
#include <algorithm>
using namespace std;

const int maxn = 100005;
int a[maxn];

int main()
{
    int n;
    while (~scanf("%d", &n))
    {
        // sum can reach 2*10^4 * 10^5 = 2*10^9, which only barely fits in a
        // signed int, so use long long to stay clear of overflow.
        long long sum = 0;
        for (int i = 0; i < n; i++)
        {
            scanf("%d", a + i);
            sum += a[i];
        }
        // Every final load is either lo = floor(sum/n) or hi = lo + 1
        // (hi == lo when sum is divisible by n).
        long long lo, hi;
        if (sum % n)
        {
            lo = sum / n;
            hi = lo + 1;
        }
        else
        {
            lo = hi = sum / n;
        }
        // ans1: tasks the under-loaded servers must receive;
        // ans2: tasks the over-loaded servers must give up.
        long long ans1 = 0, ans2 = 0;
        for (int i = 0; i < n; i++)
        {
            if (a[i] < lo) ans1 += lo - a[i];
            if (a[i] > hi) ans2 += a[i] - hi;
        }
        // Each move lowers both the surplus and the deficit by one,
        // so max(ans1, ans2) moves are necessary and sufficient.
        printf("%lld\n", max(ans1, ans2));
    }
    return 0;
}
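As a quick sanity check of the counting idea, the following standalone sketch runs the same computation on the third sample (the hard-coded array and variable names are illustrative, not part of the original solution):

#include <cstdio>
#include <algorithm>

int main() {
    // Third sample: n = 5, loads 1 2 3 4 5, expected answer 3.
    int a[] = {1, 2, 3, 4, 5};
    int n = 5;
    long long sum = 0;
    for (int i = 0; i < n; i++) sum += a[i];

    long long lo = sum / n;                   // floor average (3 here)
    long long hi = (sum % n) ? lo + 1 : lo;   // 3 here, since 15 % 5 == 0

    long long deficit = 0, surplus = 0;
    for (int i = 0; i < n; i++) {
        if (a[i] < lo) deficit += lo - a[i];  // tasks light servers must receive
        if (a[i] > hi) surplus += a[i] - hi;  // tasks heavy servers must give up
    }
    // deficit = (3-1)+(3-2) = 3, surplus = (4-3)+(5-3) = 3,
    // so max(deficit, surplus) = 3, matching the sample output.
    printf("%lld\n", std::max(deficit, surplus));  // prints 3
    return 0;
}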