  Various sorting algorithms implemented in Python
     
  Add Date: 2018-11-21
         
       
         
  Merge sort

Merge sort, also known as merging sort, is a typical application of divide and conquer. The idea is to break a problem down into small subproblems, solve each small problem, and then merge the results.

Concretely, merge sort recursively splits a sequence of n numbers in half (n / 2) until each subsequence contains only one element; a single element is already sorted. The sorted subsequences are then merged back together.

The merge step takes two sorted sequences, compares their smallest elements, appends the smaller of the two to the final result, and removes it from its subsequence; this repeats until the two sequences have been completely merged.
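
To illustrate just this merge step as described above, here is a minimal sketch (the helper name merge_two is ours, not part of the article's code; the full implementation below avoids the explicit "remove" by using sentinel values instead):

def merge_two(lnums, rnums):
    '''Merge two already sorted lists into one sorted list.'''
    result = []
    l = 0
    r = 0
    while l < len(lnums) and r < len(rnums):
        if lnums[l] <= rnums[r]:  # take the smaller head element
            result.append(lnums[l])
            l += 1
        else:
            result.append(rnums[r])
            r += 1
    # one list is exhausted; append the remainder of the other
    result.extend(lnums[l:])
    result.extend(rnums[r:])
    return result

print merge_two([1, 4, 9], [2, 3, 10])  # [1, 2, 3, 4, 9, 10]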

Code is as follows:

#!/usr/bin/python
import sys

def merge(nums, first, middle, last):
    '''Merge two sorted sub-lists.'''
    # Slices are left-closed and right-open; indexing starts at 0
    lnums = nums[first:middle+1]
    rnums = nums[middle+1:last+1]
    lnums.append(sys.maxint)  # sentinel values
    rnums.append(sys.maxint)
    l = 0
    r = 0
    for i in range(first, last+1):
        if lnums[l] < rnums[r]:
            nums[i] = lnums[l]
            l += 1
        else:
            nums[i] = rnums[r]
            r += 1

def merge_sort(nums, first, last):
    '''Merge sort.
    merge_sort is passed indices (subscripts), not a number of elements.
    '''
    if first < last:
        middle = (first + last) / 2
        merge_sort(nums, first, middle)
        merge_sort(nums, middle+1, last)
        merge(nums, first, middle, last)

if __name__ == '__main__':
    nums = [10, 8, 4, -1, 2, 6, 7, 3]
    print 'nums is:', nums
    merge_sort(nums, 0, 7)
    print 'merge sort:', nums

Stable; time complexity O(n log n).

Insertion Sort
Code is as follows:

#!/usr/bin/python
import sys

def insert_sort(a):
    '''Insertion sort.
    Given a sequence of data that is already ordered, insert one more number
    into it so that the sequence stays ordered. At the start a single element
    is trivially ordered; then insert the second element into its proper place,
    then the third, and so on.
    '''
    a_len = len(a)
    if a_len < 2:
        return a
    for i in range(1, a_len):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j+1] = a[j]
            j -= 1
        a[j+1] = key
    return a

if __name__ == '__main__':
    nums = [10, 8, 4, -1, 2, 6, 7, 3]
    print 'nums is:', nums
    insert_sort(nums)
    print 'insert sort:', nums

Stable; time complexity O(n^2).

To exchange the values of two elements in Python you can simply write: a, b = b, a. This works because each side of the assignment is a tuple (note that in Python a tuple is actually defined by the comma "," rather than by the parentheses).
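
A quick illustration of that tuple packing and unpacking (a minimal example; the names a, b and t are only for illustration):

a, b = 10, 20   # the right-hand side is the tuple (10, 20)
a, b = b, a     # pack (b, a), then unpack into a and b
print a, b      # 20 10
t = 1, 2, 3     # the commas, not parentheses, make this a tuple
print type(t)   # <type 'tuple'>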

Selection Sort

Selection sort is a simple and intuitive sorting algorithm. It works as follows: first, find the smallest (or largest) element in the unsorted sequence and store it at the start of the sorted sequence; then keep looking for the smallest (or largest) element among the remaining unsorted elements and place it at the end of the sorted sequence. Repeat until all elements are sorted.

import sys

def select_sort(a):
    '''Selection sort.
    Each pass selects the smallest (or largest) element from the data still to
    be sorted and appends it to the already sorted part, until every element
    has been placed. Selection sort is an unstable sorting method.
    '''
    a_len = len(a)
    for i in range(a_len):  # choose the element for position i (0..n-1)
        min_index = i  # index of the smallest element seen so far
        for j in range(i+1, a_len):  # find the minimum value
            if a[j] < a[min_index]:
                min_index = j
        if min_index != i:  # swap the smallest element into place
            a[i], a[min_index] = a[min_index], a[i]

if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print 'Before sort:', A
    select_sort(A)
    print 'After sort:', A

Unstable; time complexity O(n^2).


Shell sort
Shell sort, also known as diminishing increment sort, is a non-stable sorting algorithm. It gets its name from D. L. Shell, who proposed it in 1959.
First take an integer d1 smaller than n as the first increment and divide the whole file into d1 groups: all records whose distance from each other is a multiple of d1 fall into the same group. Sort within each group first;
then take a second increment d2 < d1 and repeat the grouping and sorting, until the increment dt = 1 (dt < dt-1 < ... < d2 < d1), at which point all records are in the same group and a direct insertion sort finishes the job.

import sys

def shell_sort(a):
    '''Shell sort.'''
    a_len = len(a)
    gap = a_len / 2  # increment
    while gap > 0:
        for i in range(a_len):  # selection sort within each gap group
            m = i
            j = i + gap
            while j < a_len:
                if a[j] < a[m]:
                    m = j
                j += gap  # j advances by gap
            if m != i:
                a[m], a[i] = a[i], a[m]
        gap /= 2

if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print 'Before sort:', A
    shell_sort(A)
    print 'After sort:', A

Unstable; average time complexity O(n log n), worst case O(n^s) with 1 < s < 2.

Heap sort (Heap Sort)

"Heap" is defined as: In the starting index of 0 "heap" in: < / span1) stack will be stored in the root position 02) left child node i in positions 2 * i + 13)
The right child of node i at position 2 * i + 24) at node i, a parent node position floor ((i - 1) / 2): Note floor indicates "rounding" operation
Heap features:
Key values for each node must always be greater than (or less than) its parent node

"Max heap":
"Reactor" is the key to save the root node with the largest. Or "stack" in the key of each node is always greater than its child nodes.
Move up, down:
When a key node is greater than its parent, then we will be "up" operation, that we have moved to the position of the node of its parent node,
And it is the parent node to its location, and we continue to determine the node until the node is no longer far greater than its parent stopped "Up."
Now let's look at the "down" operation. When we put a key node in piecemeal, and we will be "down" operation.
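
The heap sort code below only needs the "down" operation (max_heapify). For completeness, a minimal sketch of the "up" operation could look like this (the names parent and sift_up are ours, not part of the code below):

def parent(i):
    return (i - 1) / 2  # floor((i - 1) / 2) with Python 2 integer division

def sift_up(A, i):
    '''Move A[i] up until it is no longer greater than its parent.'''
    while i > 0 and A[i] > A[parent(i)]:
        A[i], A[parent(i)] = A[parent(i)], A[i]
        i = parent(i)

A = [9, 5, 8, 1]  # a small max heap
A[3] = 10         # increase a key, breaking the heap property
sift_up(A, 3)
print A           # [10, 9, 8, 5]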

Method:
First build a max heap (time complexity O(n)). Then, on each step, swap the root with the element at the last position of the heap, exclude that last position from the heap, and restore the heap by adjusting the new root (time complexity O(lg n)), i.e. by performing the "down" operation on the root node. The total time complexity of heap sort is O(n lg n).
Code is as follows:

#!/usr/bin/env python

# Array indexing starts at 0
def left(i):
    return 2 * i + 1

def right(i):
    return 2 * i + 2

# Maintain the max-heap property so that the subtree rooted at i is a max heap
def max_heapify(A, i, heap_size):
    if heap_size <= 0:
        return
    l = left(i)
    r = right(i)
    largest = i  # pick the larger of the node and its children
    if l < heap_size and A[l] > A[largest]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if i != largest:  # the current node is not the largest: sift it down
        A[i], A[largest] = A[largest], A[i]  # swap
        max_heapify(A, largest, heap_size)  # keep following the node down
    #print A

# Build the heap
def build_max_heap(A):
    heap_size = len(A)
    if heap_size > 1:
        node = heap_size / 2 - 1
        while node >= 0:
            max_heapify(A, node, heap_size)
            node -= 1

# Heap sort, indexing from 0
def heap_sort(A):
    build_max_heap(A)
    heap_size = len(A)
    i = heap_size - 1
    while i > 0:
        A[0], A[i] = A[i], A[0]  # move the heap maximum into its final position
        heap_size -= 1  # shrink the heap by one
        i -= 1  # index where the next maximum will be stored
        max_heapify(A, 0, heap_size)

if __name__ == '__main__':

    A = [10, -3, 5, 7, 1, 3, 7]
    print 'Before sort:', A
    heap_sort(A)
    print 'After sort:', A

Unstable; time complexity O(n log n).

Quick Sort
Like merge sort, quick sort is based on the divide-and-conquer pattern. The three-step divide-and-conquer process for quick-sorting a sub-array A[p...r] is as follows:

Divide: partition the array A[p...r] into two parts A[p...q-1] and A[q+1...r], where every element of A[p...q-1] is less than or equal to A[q] and every element of A[q+1...r] is greater than or equal to A[q];
Conquer: sort the sub-arrays A[p...q-1] and A[q+1...r] by recursive calls to quick sort;
Combine: since the two sub-arrays are sorted in place, no additional work is needed.

At the start of each iteration of the partition loop, with x = A[r], for any array index k:

1) if p <= k <= i, then A[k] <= x;
2) if i+1 <= k <= j-1, then A[k] > x;
3) if k = r, then A[k] = x.

Code is as follows:

#!/usr/bin/env python
# Quick sort
'''
Partition the array so that, taking A[r] as the pivot, elements smaller than
A[r] end up on the left and elements larger than A[r] end up on the right.
The partition step of quick sort can be done in two ways:
one uses two index pointers that scan backward one after the other,
the other uses two pointers that scan from both ends toward the middle.
'''

# p, r are indices into the array A
def partition1(A, p, r):
    '''
    Method one: two index pointers scanning backward one after the other.
    '''
    x = A[r]
    i = p - 1
    j = p
    while j < r:
        if A[j] < x:
            i += 1
            A[i], A[j] = A[j], A[i]
        j += 1
    A[i+1], A[r] = A[r], A[i+1]
    return i + 1

def partition2(A, p, r):
    '''
    Method two: two pointers scanning from both ends toward the middle.
    '''
    i = p
    j = r
    x = A[p]
    while i < j:
        while A[j] >= x and i < j:
            j -= 1
        A[i] = A[j]
        while A[i] <= x and i < j:
            i += 1
        A[j] = A[i]
    A[i] = x
    return i

# Quick sort
def quick_sort(A, p, r):
    '''
    The worst-case time complexity of quick sort is O(n^2);
    the usual time complexity is O(n lg n).
    '''
    if p < r:
        q = partition2(A, p, r)
        quick_sort(A, p, q-1)
        quick_sort(A, q+1, r)

if __name__ == '__main__':

    A = [5, -4, 6, 3, 7, 11, 1, 2]
    print 'Before sort:', A
    quick_sort(A, 0, 7)
    print 'After sort:', A

Unstable; best-case time complexity O(n log n), worst case O(n^2).

A note on sequences in Python:
Lists, tuples and strings are all sequences. What is a sequence, and what makes it special? The two main features of a sequence are the indexing operator and the slicing operation. The indexing operator lets us fetch a particular item from the sequence. The slicing operation lets us take a slice of the sequence, i.e. a part of it. For example, with a = ['aa', 'bb', 'cc'], print a[0] is an indexing operation and print a[0:2] is a slicing operation.
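
A quick run of those operations, using the same list as above:

a = ['aa', 'bb', 'cc']
print a[0]    # indexing: 'aa'
print a[0:2]  # slicing: ['aa', 'bb']
print a[-1]   # negative indices count from the end: 'cc'
print a[1:]   # slice from index 1 to the end: ['bb', 'cc']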
     
         
       
         