  A Practical Implementation of the KNN Algorithm in Python
     
  Add Date : 2017-01-08      
         
       
         
  Implementing the k-nearest neighbors (KNN) classification algorithm in Python is a well-worn topic, and there is already plenty of material about it online, but I decided to record my own learning experience here anyway.

1. Configuring the numpy library

numpy is a third-party Python library for matrix operations that most math-heavy code ends up depending on. For notes on configuring numpy and matplotlib, see my earlier article on the tortuous road of configuring third-party Python libraries. Once the configuration is complete, import the whole numpy library into the current project.
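As a quick sanity check that the library is available (this is just my own verification step, assuming numpy was installed with pip), the following should run without errors:

# install first if necessary:  pip install numpy
import numpy as np

print(np.__version__)                      # prints the installed numpy version
print(np.array([[1.0, 1.1], [1.0, 1.0]]))  # a small 2x2 array to confirm array() works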

2. Preparing the training samples

Here I simply construct four points and their corresponding labels as the KNN training samples:

# ==================== Create training samples ====================
def createdataset():
    group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'B', 'C', 'D']
    return group, labels
One small detail: when initializing a numpy array object with the array() function, make sure only a single argument is passed, so the nested coordinate lists have to be wrapped in an extra pair of brackets. The following call is therefore not a legal way to do it:

group = array([1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1])
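With the brackets in place, a quick check of the function above (this snippet assumes the numpy wildcard import used in the full listing in section 4):

group, labels = createdataset()
print(group.shape)  # (4, 2): four training points with two coordinates each
print(labels)       # ['A', 'B', 'C', 'D']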
3. Creating the classification function

For classification, KNN usually relies on the Euclidean distance: the training data is subtracted from the input sample in every dimension, the differences are squared and summed, and then the square root is taken, as follows:

# ==================== Euclidean distance classifier ====================
def classify(Inx, Dataset, labels, k):
    DataSetSize = Dataset.shape[0]  # number of rows in the training set; shape[1] would be the number of columns
    diffmat = tile(Inx, (DataSetSize, 1)) - Dataset
    SqDiffMat = diffmat ** 2
    SqDistances = SqDiffMat.sum(axis=1)
    Distance = SqDistances ** 0.5
    SortedDistanceIndicies = Distance.argsort()
    ClassCount = {}
The tile() function here is numpy's array-repetition function: in this example the training set consists of four two-dimensional points, so the input sample (a single two-dimensional point) has to be tiled into a 4-by-1 repetition of itself before the matrix subtraction; the differences are then squared, summed along each row, and square-rooted to obtain the distances. Once the distances are computed, the array object's member function argsort() is called to sort them in ascending order (it returns the indices that would sort the array). A small PyCharm tip for reading source code: if you are not sure whether argsort() really is a member function of the array object, select the function, right-click and choose Go to -> Declaration; this jumps to the declaration of argsort(), and from that code you can confirm that the call is indeed a valid member function of the array class.
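To make the intermediate steps concrete, here is a small trace of what classify() computes for the input [0, 0] against the training set above (the values in the comments are what numpy produces):

from numpy import array, tile

group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
inx = [0, 0]

diffmat = tile(inx, (4, 1)) - group        # [[-1.0, -1.1], [-1.0, -1.0], [0.0, 0.0], [0.0, -0.1]]
sq_distances = (diffmat ** 2).sum(axis=1)  # [2.21, 2.0, 0.0, 0.01]
distances = sq_distances ** 0.5            # [1.4866..., 1.4142..., 0.0, 0.1]
print(distances.argsort())                 # [2 3 1 0]: the third point (label 'C') is nearest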

After sorting the distances, the next step is to take the labels of the K samples with the smallest distances and use them to decide which class the current sample belongs to:

    for i in range(k):
        VoteiLabel = labels[SortedDistanceIndicies[i]]
        ClassCount[VoteiLabel] = ClassCount.get(VoteiLabel, 0) + 1
    SortedClassCount = sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)
One small point: in Python 2 the dictionary elements are obtained with the dict.iteritems() member function, whereas in Python 3 it is dict.items(). "key=operator.itemgetter(1)" tells sorted() to order the dictionary entries by their second field, i.e. the vote count; note that the operator module has to be imported beforehand. In other words, the class of the test sample is decided by counting how many times each label appears among the K smallest distances and taking the most frequent one.
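As a quick illustration of that sorting step (the vote counts below are made up for the example rather than produced by the program above):

import operator

class_count = {'A': 1, 'C': 2}  # hypothetical vote counts for k = 3
ranked = sorted(class_count.items(), key=operator.itemgetter(1), reverse=True)
print(ranked)        # [('C', 2), ('A', 1)]
print(ranked[0][0])  # 'C' -- the label with the most votes wins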

4. Testing the classifier

Here is the complete KNN test code:

# coding: utf-8
from numpy import *
import operator


# ==================== Create training samples ====================
def createdataset():
    group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'B', 'C', 'D']
    return group, labels


# ==================== Euclidean distance classifier ====================
def classify(Inx, Dataset, labels, k):
    DataSetSize = Dataset.shape[0]  # number of rows in the training set; shape[1] would be the number of columns
    diffmat = tile(Inx, (DataSetSize, 1)) - Dataset
    SqDiffMat = diffmat ** 2
    SqDistances = SqDiffMat.sum(axis=1)
    Distance = SqDistances ** 0.5
    SortedDistanceIndicies = Distance.argsort()
    ClassCount = {}
    for i in range(k):
        VoteiLabel = labels[SortedDistanceIndicies[i]]
        ClassCount[VoteiLabel] = ClassCount.get(VoteiLabel, 0) + 1
    SortedClassCount = sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)
    return SortedClassCount[0][0]


Groups, Labels = createdataset()
Result = classify([0, 0], Groups, Labels, 1)
print(Result)
Running the code, the program prints the result "C". It is worth mentioning that when each class has only one training sample, as in this example, the KNN value of K should be set to 1.
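As a further check of my own (not part of the original test above), classifying a point close to the first training sample returns 'A':

Result = classify([1.0, 1.2], Groups, Labels, 1)
print(Result)  # 'A' -- the nearest training point is [1.0, 1.1]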
     
         
       
         