A Practical Implementation of the KNN Algorithm in Python

Add Date: 2017-01-08
Implementing the K-nearest neighbor (KNN) classification algorithm in Python is a well-worn topic, and there is already plenty of material about it online, but I decided to write down my own learning experience here.

1. Configure the numpy library

numpy is a third-party Python library for matrix operations that most math-heavy code relies on. For details on setting it up, see the earlier article on the tortuous road of configuring the third-party libraries Numpy and matplotlib. Once the configuration is complete, import the whole numpy library into the current project.
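If you want to confirm that numpy is importable before going further, a minimal check like the following (my own quick sanity test, not part of the original configuration guide) will do:

# Quick sanity check that numpy is installed and importable
from numpy import array, __version__

print(__version__)              # prints the installed numpy version
print(array([[1, 2], [3, 4]]))  # builds a small 2x2 matrix to confirm array() works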

2. Prepare the training samples

Here we simply construct four points and their corresponding labels as the KNN training samples:

# ==================== Create training samples ====================
def createdataset():
    group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'B', 'C', 'D']
    return group, labels
One small detail: when initializing a numpy array object with the array() function, make sure you pass it only a single argument, which is why the data needs the extra pair of brackets in the code above. A call like the following is therefore not valid:

group = array([1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1])
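For illustration, here is a small standalone sketch (my own addition, not from the original article) contrasting the two calls; the multi-argument form fails because array() expects the whole data structure as its first argument:

import numpy as np

# Correct: a single nested list describes the whole 4x2 matrix
good = np.array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
print(good.shape)  # (4, 2)

# Incorrect: four separate list arguments -- numpy raises a TypeError
try:
    bad = np.array([1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1])
except TypeError as err:
    print('TypeError:', err)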
3. Create a classification function

KNN usually classifies by Euclidean distance: the input data is subtracted from every training sample in each dimension, the differences are squared and summed, and the square root is taken, as follows:

# ==================== Euclidean distance classifier ====================
def classify(Inx, Dataset, labels, k):
    DataSetSize = Dataset.shape[0]  # number of rows; shape[1] would give the number of columns
    diffmat = tile(Inx, (DataSetSize, 1)) - Dataset
    SqDiffMat = diffmat ** 2
    SqDistances = SqDiffMat.sum(axis=1)
    Distance = SqDistances ** 0.5
    SortedDistanceIndicies = Distance.argsort()
    ClassCount = {}
Here tile() is numpy's array-tiling function. The training set in this example contains four two-dimensional points, so the input sample (a single 2-D point) has to be tiled into four identical rows before the element-wise subtraction; squaring, summing along each row, and taking the square root then yield the distances. Once the distances are computed, the array object's argsort() member function sorts them in ascending order and returns the corresponding indices. A small PyCharm tip for viewing source: while writing this I was not sure whether argsort() is a member function of the array object, so I selected the function and chose Go To -> Declaration, which jumps to the declaration of argsort(); reading that code confirms that the array class does provide it, so the call is fine.
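To make the behavior of tile() and argsort() concrete, here is a minimal standalone sketch (the values simply reuse the training points from above):

from numpy import array, tile

point = array([0, 0])  # the input sample
group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])

repeated = tile(point, (4, 1))  # stack the point into 4 identical rows to match group
dist = (((repeated - group) ** 2).sum(axis=1)) ** 0.5
print(dist)            # approximately [1.4866 1.4142 0. 0.1]
print(dist.argsort())  # [2 3 1 0] -> indices of the samples in ascending distance order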

After the distances have been sorted, the next step is to look at the labels of the K nearest samples and decide which class the current sample belongs to:

    for i in range(k):
        VoteiLabel = labels[SortedDistanceIndicies[i]]
        ClassCount[VoteiLabel] = ClassCount.get(VoteiLabel, 0) + 1
    SortedClassCount = sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)
One small pitfall: in Python 2 the dictionary elements are obtained with the dict.iteritems() member function, whereas Python 3 uses dict.items() instead. "key=operator.itemgetter(1)" tells sorted() to order the dictionary items by their second field, i.e. the count, and it requires importing the operator module beforehand. In short, we count how many times each label appears among the K smallest distances and assign the test sample to the most frequent one.
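As a tiny standalone illustration of that sorting idiom (the vote counts below are invented for the example):

import operator

class_count = {'A': 2, 'B': 1, 'C': 4}  # hypothetical label votes

# sort the (label, count) pairs by the count, largest first
ranked = sorted(class_count.items(), key=operator.itemgetter(1), reverse=True)
print(ranked)        # [('C', 4), ('A', 2), ('B', 1)]
print(ranked[0][0])  # 'C' -- the winning label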

4. Test the classifier

Here is the complete KNN test code:

# coding: utf-8
from numpy import *
import operator


# ==================== Create training samples ====================
def createdataset():
    group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'B', 'C', 'D']
    return group, labels


# ==================== Euclidean distance classifier ====================
def classify(Inx, Dataset, labels, k):
    DataSetSize = Dataset.shape[0]  # number of rows; shape[1] would give the number of columns
    diffmat = tile(Inx, (DataSetSize, 1)) - Dataset
    SqDiffMat = diffmat ** 2
    SqDistances = SqDiffMat.sum(axis=1)
    Distance = SqDistances ** 0.5
    SortedDistanceIndicies = Distance.argsort()
    ClassCount = {}
    for i in range(k):
        VoteiLabel = labels[SortedDistanceIndicies[i]]
        ClassCount[VoteiLabel] = ClassCount.get(VoteiLabel, 0) + 1
    SortedClassCount = sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)
    return SortedClassCount[0][0]


Groups, Labels = createdataset()
Result = classify([0, 0], Groups, Labels, 1)
print(Result)
Running the code prints the expected result "C". One thing worth mentioning: when each class is represented by only a single training sample, as it is here, the KNN parameter K should be set to 1.
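If you want to try the classifier on a second point, a quick check (my own addition) with an input near the top-right samples should come back labeled "A" when K = 1:

Groups, Labels = createdataset()
print(classify([1.0, 1.2], Groups, Labels, 1))  # nearest neighbour is [1.0, 1.1], so this prints 'A'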
     
         
         
         