Sorting Algorithms Part 1


In this article, I will talk about some elementary and frequently used sorting algorithms.

1. Bubble Sort

This algorithm gets its name from the way smaller elements “bubble” to the top of the list one by one.

void BubbleSort(int list[], int total) {
    int temp;
    /* After each outer pass, the smallest remaining element has
       "bubbled" to the front of the unsorted region. */
    for (int i = total - 1; i > 0; i--) {
        for (int j = total - 1; j > total - 1 - i; j--) {
            if (list[j] < list[j - 1]) {
                temp = list[j];
                list[j] = list[j - 1];
                list[j - 1] = temp;
            }
        }
    }
}

Time complexity: O(n²)

In the above algorithm, the time complexity is O(n²) even in the best case, i.e., when the list is already sorted. To improve on this, we can add a flag and stop early once a full pass performs no swaps.

Modified version:

void NewBubbleSort(int list[], int n) {
    int i, j, temp, flag = 1;   /* flag = 1 means a swap happened in the last pass */
    for (i = 0; i < n - 1 && flag; i++) {
        flag = 0;
        for (j = 0; j < n - 1 - i; j++) {
            if (list[j] > list[j + 1]) {
                temp = list[j];
                list[j] = list[j + 1];
                list[j + 1] = temp;
                flag = 1;
            }
        }
    }
}
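To see the payoff of the flag, here is a small counting sketch (the comparison counter and the name BubbleSortCounted are additions for illustration, not part of the algorithm): on an already-sorted list, the loop exits after a single pass of n-1 comparisons instead of doing all n(n-1)/2.

```c
#include <stddef.h>

/* Bubble sort with the early-exit flag; returns the number of
   comparisons performed, to illustrate the best case. */
size_t BubbleSortCounted(int list[], int n) {
    size_t comparisons = 0;
    int swapped = 1;
    for (int i = 0; i < n - 1 && swapped; i++) {
        swapped = 0;
        for (int j = 0; j < n - 1 - i; j++) {
            comparisons++;
            if (list[j] > list[j + 1]) {
                int temp = list[j];
                list[j] = list[j + 1];
                list[j + 1] = temp;
                swapped = 1;
            }
        }
    }
    return comparisons;
}
```

On a sorted list of 5 elements this does 4 comparisons; on a reversed list it still does the full 10.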

2. Selection Sort

This algorithm is called selection sort because it repeatedly selects the smallest element of the unsorted part and swaps it into the current position (a[i]).

void SelectionSort(int a[], int n) {
    int i, j, min, temp;
    for (i = 0; i < n - 1; i++) {
        /* find the index of the smallest element in a[i..n-1] */
        min = i;
        for (j = i + 1; j < n; j++) {
            if (a[j] < a[min]) {
                min = j;
            }
        }
        temp = a[i];
        a[i] = a[min];
        a[min] = temp;
    }
}

Time complexity: O(n²)

3. Insertion Sort

This algorithm divides the list into a sorted part and an unsorted part. Initially, the sorted part contains only the first element of the list. On each step, we take the first element of the unsorted part and insert it into its proper position in the sorted part.

void InsertionSort(int a[], int n) {
    int i, hole, value;
    for (i = 1; i < n; i++) {
        /* shift larger elements right to open a hole for a[i] */
        hole = i;
        value = a[i];
        while (hole > 0 && a[hole - 1] > value) {
            a[hole] = a[hole - 1];
            hole--;
        }
        a[hole] = value;
    }
}

Time complexity: O(n²) in the worst and average cases, O(n) in the best case (an already-sorted list).

4. Shell Sort

This sorting algorithm is a generalization of insertion sort. Instead of comparing only adjacent pairs, shell sort makes several passes using various gaps between compared elements. In other words, it repeatedly applies insertion sort with several different gaps; when the gap is 1, it is identical to the insertion sort above.

void ShellSort(int a[], int n) {
    int i, j, gap, temp;
    gap = n / 2;
    while (gap > 0) {
        for (i = gap; i < n; i++) {
            /* gapped insertion: compare against temp, not a[i],
               since a[i] may be overwritten by the first shift */
            temp = a[i];
            for (j = i; j >= gap && a[j - gap] > temp; j -= gap) {
                a[j] = a[j - gap];
            }
            a[j] = temp;
        }
        gap = gap / 2;
    }
}

The time complexity of this algorithm is harder to analyze and depends on the chosen gap sequence: with the simple halving sequence used above, the worst case is O(n²), and carefully chosen sequences do asymptotically better. A full analysis requires some discrete mathematics, so we won't go into it here.
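For concreteness, the halving gap sequence used above can be computed separately (the helper name Gaps is introduced just for this illustration):

```c
/* Collect the gaps produced by repeatedly halving n
   (Shell's original sequence). Returns how many gaps there are. */
int Gaps(int n, int out[]) {
    int count = 0;
    for (int gap = n / 2; gap > 0; gap /= 2) {
        out[count++] = gap;   /* e.g. n = 20 yields 10, 5, 2, 1 */
    }
    return count;
}
```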

5. Merge Sort

The basic idea of merge sort is to divide the list into two halves, sort each half recursively, and then merge them together.

void Merge(int a[], int temp[], int left, int right) {
    int mid = (left + right) / 2;
    int leftstart = left;       /* remember where the merged range begins */
    int rightstart = mid + 1;
    int temppos = left;
    /* merge the two sorted halves a[left..mid] and a[mid+1..right] */
    while (left <= mid && rightstart <= right) {
        if (a[left] < a[rightstart]) {
            temp[temppos++] = a[left++];
        } else {
            temp[temppos++] = a[rightstart++];
        }
    }
    /* copy whichever half still has elements left */
    while (left <= mid) {
        temp[temppos++] = a[left++];
    }
    while (rightstart <= right) {
        temp[temppos++] = a[rightstart++];
    }
    /* copy the merged range back into a */
    for (int i = leftstart; i <= right; i++) {
        a[i] = temp[i];
    }
}

void MergeSort(int a[], int temp[], int left, int right) {
    if (right > left) {
        int mid = (left + right) / 2;
        MergeSort(a, temp, left, mid);
        MergeSort(a, temp, mid + 1, right);
        Merge(a, temp, left, right);
    }
}

This is a divide-and-conquer algorithm; the recurrence for merge sort is T(n) = 2T(n/2) + O(n). By the master theorem for divide and conquer, the time complexity of merge sort is O(n log n).
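The recurrence can also be unrolled directly, without the master theorem (a short derivation; c stands for the constant hidden in the O(n) merge cost):

```latex
\begin{aligned}
T(n) &= 2\,T(n/2) + cn \\
     &= 4\,T(n/4) + 2cn \\
     &\;\;\vdots \\
     &= 2^{k}\,T\!\left(n/2^{k}\right) + kcn.
\end{aligned}
```

Taking k = log₂ n, the first term becomes n·T(1) = O(n) and the second becomes cn log₂ n, so T(n) = O(n log n).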

6. Quick Sort

This algorithm is another example of divide and conquer. The array is partitioned into two subarrays such that every element of the left subarray is no larger than every element of the right subarray; the two subarrays are separated by an element called the pivot. After that, the two subarrays are sorted by recursive calls of Quick Sort.

void swap(int A[], int left, int right) {
    int temp = A[left];
    A[left] = A[right];
    A[right] = temp;
}

int Partition(int A[], int low, int high) {
    /* A[low] is the pivot; left scans forward, right scans backward */
    int left = low;
    int right = high;
    while (left < right) {
        while (A[left] <= A[low] && left < high) {
            left++;
        }
        while (A[right] >= A[low] && right > low) {
            right--;
        }
        if (left < right) {
            swap(A, left, right);
        }
    }
    /* place the pivot between the two partitions */
    swap(A, low, right);
    return right;
}

void QuickSort(int A[], int left, int right) {
    int pivot;
    if (right > left) {
        pivot = Partition(A, left, right);
        QuickSort(A, left, pivot - 1);
        QuickSort(A, pivot + 1, right);
    }
}

Time complexity: best case O(n log n); worst case O(n²); average case O(n log n).
We can improve this method by choosing the pivot randomly, which makes the worst case unlikely in practice and requires only a few changes to the partition function.
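A minimal sketch of that change (the helper names RandomizedPartition and Swap, and the use of rand(), are additions for illustration): before partitioning, swap a randomly chosen element into the pivot slot, then proceed exactly as before.

```c
#include <stdlib.h>   /* rand */

void Swap(int A[], int i, int j) {
    int temp = A[i];
    A[i] = A[j];
    A[j] = temp;
}

/* Same partition scheme as above, except that A[low] becomes
   the pivot only after a random element is swapped into it. */
int RandomizedPartition(int A[], int low, int high) {
    int r = low + rand() % (high - low + 1);
    Swap(A, low, r);          /* random pivot moves to the front */
    int left = low, right = high;
    while (left < right) {
        while (A[left] <= A[low] && left < high) left++;
        while (A[right] >= A[low] && right > low) right--;
        if (left < right) Swap(A, left, right);
    }
    Swap(A, low, right);
    return right;
}

void RandomizedQuickSort(int A[], int left, int right) {
    if (right > left) {
        int pivot = RandomizedPartition(A, left, right);
        RandomizedQuickSort(A, left, pivot - 1);
        RandomizedQuickSort(A, pivot + 1, right);
    }
}
```

Note that rand() % k is slightly biased for large k; for a blog-sized example that is fine, but production code would use a better random source.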
