  Various sorting algorithms implemented in Python
  Add Date : 2018-11-21      
  Merge sort

Merge sort is a classic application of divide and conquer. The divide-and-conquer idea is to break a problem into small subproblems, solve each subproblem, and then merge the results.

Concretely, merge sort recursively splits a group of n numbers in half until each part holds a single element, which by itself is already sorted, and then merges these sorted parts back together.

The merge step takes two sorted sequences, repeatedly compares the smallest remaining element of each, removes the smaller of the two from its sequence, and appends it to the result, until the two sequences have been merged completely.

Code is as follows:

#!/usr/bin/python
import sys

def merge(nums, first, middle, last):
    '''Merge the sorted slices nums[first..middle] and nums[middle+1..last]'''
    # Slices are half-open, so the right bound needs + 1
    lnums = nums[first:middle + 1]
    rnums = nums[middle + 1:last + 1]
    # Sentinels: neither sublist can run out before the loop ends
    lnums.append(sys.maxsize)
    rnums.append(sys.maxsize)
    l = 0
    r = 0
    for i in range(first, last + 1):
        if lnums[l] <= rnums[r]:   # <= keeps equal elements in order (stable)
            nums[i] = lnums[l]
            l += 1
        else:
            nums[i] = rnums[r]
            r += 1

def merge_sort(nums, first, last):
    '''Merge sort.
    merge_sort takes indices, not the number of elements.
    '''
    if first < last:
        middle = (first + last) // 2
        merge_sort(nums, first, middle)
        merge_sort(nums, middle + 1, last)
        merge(nums, first, middle, last)

if __name__ == '__main__':':
    nums = [10, 8, 4, -1, 2, 6, 7, 3]
    print('nums is:', nums)
    merge_sort(nums, 0, 7)
    print('merge sort:', nums)

Stable, with time complexity O(n log n).
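To see what "stable" means here: a stable sort never reorders elements that compare equal. A minimal illustration (using Python's built-in `list.sort`, which is guaranteed stable, rather than the merge sort above):

```python
# Records with equal keys: a stable sort preserves their original order
pairs = [(2, 'first'), (1, 'x'), (2, 'second')]
pairs.sort(key=lambda p: p[0])   # sort by the numeric key only
print(pairs)                     # → [(1, 'x'), (2, 'first'), (2, 'second')]
```

The two records with key 2 stay in their original relative order, which is exactly the property the merge step's `<=` comparison guarantees.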

Insertion Sort
Code is as follows:

#!/usr/bin/python

def insert_sort(a):
    '''Insertion sort.
    Given a sequence whose first part is already sorted, insert the next
    element into its proper place so the sequence stays sorted. Initially
    the first element alone forms a sorted sequence; insert the second
    element into position, then the third, and so on.
    '''
    a_len = len(a)
    if a_len < 2:
        return a
    for i in range(1, a_len):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot to the right
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

if __name__ == '__main__':
    nums = [10, 8, 4, -1, 2, 6, 7, 3]
    print('nums is:', nums)
    insert_sort(nums)
    print('insert sort:', nums)

Stable, with time complexity O(n^2).

To exchange the values of two variables in Python you can write a, b = b, a. This works because each side of the assignment is actually a tuple (it should be emphasized that in Python a tuple is defined by the comma "," rather than by the parentheses).
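A quick illustration of the tuple swap and of the comma defining the tuple:

```python
a, b = 1, 2
a, b = b, a    # the right side builds the tuple (b, a), which is then unpacked
print(a, b)    # → 2 1

t = 1, 2       # the comma, not the parentheses, makes this a tuple
print(type(t).__name__)   # → tuple
```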

Selection Sort

Selection sort is a simple and intuitive sorting algorithm. It works as follows: first, find the smallest (or largest) element in the unsorted sequence and place it at the start of the sorted sequence; then keep finding the smallest (or largest) element among the remaining unsorted elements and append it to the end of the sorted sequence. Repeat until all elements are sorted.

def select_sort(a):
    '''Selection sort.
    Each pass selects the smallest (or largest) element among the data
    still to be sorted and places it at the end of the sorted part,
    until no unsorted elements remain.
    Selection sort is an unstable sorting method.
    '''
    a_len = len(a)
    for i in range(a_len):              # choose the element for position i
        min_index = i                   # index of the smallest element found so far
        for j in range(i + 1, a_len):   # scan for the minimum
            if a[j] < a[min_index]:
                min_index = j
        if min_index != i:              # swap the smallest element into place
            a[i], a[min_index] = a[min_index], a[i]

if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print('Before sort:', A)
    select_sort(A)
    print('After sort:', A)

Unstable, with time complexity O(n^2).

Shell sort
Shell sort, also known as diminishing increment sort, is an unstable sorting algorithm. It owes its name to D. L. Shell, who proposed it in 1959.
First take an integer d1 smaller than n as the first increment and divide the whole file into d1 groups, where all records whose indices differ by a multiple of d1 fall into the same group, and sort within each group;
then take a second increment d2 < d1 and repeat the grouping and sorting, until the increment reaches dt = 1 (dt < dt-1 < ... < d2 < d1), i.e. all records fall into a single group and a final direct insertion sort completes the job.

def shell_sort(a):
    '''Shell sort
    (selection sort is used within each gap group)
    '''
    a_len = len(a)
    gap = a_len // 2                # initial increment
    while gap > 0:
        for i in range(a_len):      # selection sort within the same group
            m = i                   # index of the smallest element seen so far
            j = i + gap
            while j < a_len:
                if a[j] < a[m]:
                    m = j
                j += gap            # advance j by the increment
            if m != i:
                a[m], a[i] = a[i], a[m]
        gap //= 2

if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print('Before sort:', A)
    shell_sort(A)
    print('After sort:', A)

Unstable; average time complexity O(n log n), worst case O(n^s) with 1 < s < 2.

Heap sort (Heap Sort)

A "heap" with indices starting at 0 is defined by:
1) the root of the heap is stored at position 0;
2) the left child of node i is at position 2*i + 1;
3) the right child of node i is at position 2*i + 2;
4) the parent of node i is at position floor((i - 1) / 2), where floor denotes rounding down.
Heap property:
the key of each node must always be less than or equal to (in a max heap) or greater than or equal to (in a min heap) its parent's key.

"Max heap":
a heap whose root node holds the largest key; equivalently, the key of every node in the heap is never smaller than the keys of its children.
Sift up and sift down:
when a node's key becomes greater than its parent's, we perform a "sift up": the node is moved into its parent's position and the parent moves down into the node's place, and we keep comparing against the new parent until the node is no longer greater than its parent.
The "sift down" operation is the mirror image: when a node's key is decreased (for example, when a new key is placed at the root), we swap it with its larger child repeatedly until the heap property is restored.
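The code below implements only the sift-down step (`max_heapify`). For completeness, a minimal sketch of the sift-up operation described above might look like this (the name `sift_up` is my own, not part of the original code):

```python
def sift_up(A, i):
    """Move A[i] up until its parent is no smaller (max-heap convention)."""
    while i > 0:
        parent = (i - 1) // 2          # parent index, per the formula above
        if A[i] > A[parent]:
            A[i], A[parent] = A[parent], A[i]   # swap with the parent
            i = parent                 # keep following the node upward
        else:
            break                      # heap property restored

# sift_up supports insertion into an existing max heap:
heap = [9, 7, 8, 3, 1]
heap.append(10)                # the new key violates the heap property
sift_up(heap, len(heap) - 1)
print(heap[0])                 # → 10: the new key has risen to the root
```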

We first build a max heap (time complexity O(n)); then on each round we simply swap the root with the element at the last position of the heap, exclude that last position from the heap, and re-adjust the heap by sifting the new root down (time complexity O(log n)). The total time complexity of heap sort is O(n log n).
Code is as follows:

#!/usr/bin/env python

# Array indices start at 0
def left(i):
    return 2 * i + 1

def right(i):
    return 2 * i + 2

# Maintain the max-heap property so that the subtree rooted at i is a max heap
def max_heapify(A, i, heap_size):
    if heap_size <= 0:
        return
    l = left(i)
    r = right(i)
    largest = i                      # pick the larger of node i and its children
    if l < heap_size and A[l] > A[largest]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if i != largest:                 # the current node is not the largest: sift down
        A[i], A[largest] = A[largest], A[i]   # swap
        max_heapify(A, largest, heap_size)    # keep following the node down

# Build the heap
def build_max_heap(A):
    heap_size = len(A)
    if heap_size > 1:
        node = heap_size // 2 - 1    # last non-leaf node
        while node >= 0:
            max_heapify(A, node, heap_size)
            node -= 1

# Heap sort, indices starting at 0
def heap_sort(A):
    build_max_heap(A)
    heap_size = len(A)
    i = heap_size - 1
    while i > 0:
        A[0], A[i] = A[i], A[0]   # move the heap maximum into its final position
        heap_size -= 1            # shrink the heap by one
        i -= 1                    # the next final position moves one slot left
        max_heapify(A, 0, heap_size)

if __name__ == '__main__':
    A = [10, -3, 5, 7, 1, 3, 7]
    print('Before sort:', A)
    heap_sort(A)
    print('After sort:', A)

Unstable, with time complexity O(n log n).

Quick Sort
Like merge sort, quick sort is based on the divide-and-conquer paradigm. The three steps for quick-sorting a subarray A[p...r] are as follows:

Divide: partition the array A[p...r] into two parts A[p...q-1] and A[q+1...r], such that every element of A[p...q-1] is less than or equal to A[q] and every element of A[q+1...r] is greater than or equal to A[q];
Conquer: sort the subarrays A[p...q-1] and A[q+1...r] by recursive calls to quick sort;
Combine: since the two subarrays are sorted in place, no extra work is needed.

At the start of each iteration of the partition loop, with pivot x = A[r], the following holds for any array index k:

1) if p <= k <= i, then A[k] <= x;
2) if i + 1 <= k <= j - 1, then A[k] > x;
3) if k = r, then A[k] = x.

Code is as follows:

#!/usr/bin/env python
# Quick sort
'''
Partition the array around the pivot so that elements smaller than the pivot
end up on its left and elements larger than it end up on its right.
There are two common ways to write the partition step:
one scans with two index pointers moving forward one after the other,
the other scans with two pointers moving from the ends toward the middle.
'''

# p, r are indices into the array A
def partition1(A, p, r):
    '''
    Method one: two pointers scanning forward one after the other
    '''
    x = A[r]
    i = p - 1
    j = p
    while j < r:
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
        j += 1
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

def partition2(A, p, r):
    '''
    Method two: two pointers scanning from the ends toward the middle
    '''
    i = p
    j = r
    x = A[p]
    while i < j:
        while A[j] >= x and i < j:
            j -= 1
        A[i] = A[j]
        while A[i] <= x and i < j:
            i += 1
        A[j] = A[i]
    A[i] = x
    return i

# Quick sort
def quick_sort(A, p, r):
    '''
    Worst-case time complexity is O(n^2); the usual time complexity is O(n log n)
    '''
    if p < r:
        q = partition2(A, p, r)
        quick_sort(A, p, q - 1)
        quick_sort(A, q + 1, r)

if __name__ == '__main__':
    A = [5, -4, 6, 3, 7, 11, 1, 2]
    print('Before sort:', A)
    quick_sort(A, 0, 7)
    print('After sort:', A)

Unstable; best-case time complexity O(n log n), worst case O(n^2).

A note on sequences in Python:
Lists, tuples and strings are all sequences. But what are sequences, and why are they special? The two main features of a sequence are the indexing operator and the slicing operation. The indexing operator lets us fetch a particular item from the sequence, and the slicing operation lets us obtain a slice of the sequence, i.e. a part of it. For example, with a = ['aa', 'bb', 'cc'], a[0] is an indexing operation and a[0:2] is a slicing operation.
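A short illustration of indexing and slicing on the list from the paragraph above:

```python
a = ['aa', 'bb', 'cc']
print(a[0])      # indexing → 'aa'
print(a[0:2])    # slicing → ['aa', 'bb']
print(a[-1])     # negative indices count from the end → 'cc'
print(a[:])      # a full slice makes a shallow copy of the whole list
```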
  CopyRight 2002-2016 newfreesoft.com, All Rights Reserved.