
Implementing the k-nearest neighbors (KNN) classification algorithm in Python is a well-worn topic, and there is already plenty of material online, but I decided to record my own learning experience here.
1. Configuring the NumPy library
NumPy is a third-party Python library for matrix operations, on which most math-related libraries depend. For details on configuring NumPy, see my previous post on the tortuous road of configuring the third-party libraries NumPy and matplotlib. Once configuration is complete, import the whole NumPy library into the current project.
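As a quick sanity check that NumPy is configured correctly (a minimal sketch; the sample values are arbitrary):

```python
# Verify that NumPy is importable and that array construction works.
from numpy import array

a = array([[1.0, 1.1], [1.0, 1.0]])
print(a.shape)  # (2, 2)
```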
2. Preparing the training samples
Here we simply construct four points and their corresponding labels as the KNN training samples:
# ==================== Create training samples ====================
def createdataset():
    group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'B', 'C', 'D']
    return group, labels
One small detail: when initializing a NumPy array object with the array() function, make sure to pass a single argument, i.e. wrap all the rows in an outer pair of brackets. A call like the following is therefore not legal:
group = array([1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1])
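A minimal sketch of the difference (the correct call passes one nested list; the incorrect call passes four separate arguments and raises a TypeError):

```python
from numpy import array

# Correct: a single argument, a nested list of rows.
group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
print(group.shape)  # (4, 2)

# Incorrect: four separate positional arguments; NumPy raises a TypeError.
try:
    array([1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1])
except TypeError as e:
    print("TypeError:", e)
```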
3. Creating the classification function
The k-nearest neighbors algorithm usually classifies by Euclidean distance: the input data is subtracted from each training sample in every dimension, the differences are squared and summed, and then the square root is taken. The code is as follows:
# ==================== Euclidean distance classifier ====================
def classify(Inx, Dataset, labels, k):
    DataSetSize = Dataset.shape[0]  # shape[0] is the number of rows, shape[1] the number of columns
    diffmat = tile(Inx, (DataSetSize, 1)) - Dataset
    SqDiffMat = diffmat ** 2
    SqDistances = SqDiffMat.sum(axis=1)
    Distance = SqDistances ** 0.5
    SortedDistanceIndicies = Distance.argsort()
    ClassCount = {}
The tile() function here is NumPy's matrix-expansion function. In this example the training set consists of four two-dimensional points, so the input sample (a single two-dimensional point) must be expanded into a matrix of four identical rows before the matrix subtraction; the differences are then squared, summed, and square-rooted to compute the distances. Once the distances are computed, calling the array object's argsort() member function sorts them in ascending order (returning the sorted indices).
A small PyCharm source-browsing tip: if, while writing this program, we are unsure whether argsort() is a member function of the array object, we can select the function, right-click, and choose Go To > Declaration. This jumps to the code where argsort() is declared, and by reading it we can confirm that the function does belong to the array class and that the call is valid.
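The expansion and sorting steps described above can be tried in isolation (a minimal sketch using the same four training points):

```python
from numpy import array, tile

Inx = [0, 0]  # input sample: a single two-dimensional point
Dataset = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])

# Expand the input point into four identical rows so it can be
# subtracted from the training matrix element-wise.
expanded = tile(Inx, (4, 1))
print(expanded)
# [[0 0]
#  [0 0]
#  [0 0]
#  [0 0]]

diff = expanded - Dataset
distances = ((diff ** 2).sum(axis=1)) ** 0.5
print(distances.argsort())  # indices of samples from nearest to farthest: [2 3 1 0]
```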
After sorting the distances, the next step is to look at the labels of the K samples with the smallest distances and use them to determine which class the current sample belongs to:
    for i in range(k):
        VoteiLabel = labels[SortedDistanceIndicies[i]]
        ClassCount[VoteiLabel] = ClassCount.get(VoteiLabel, 0) + 1
    SortedClassCount = sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)
One small catch: in Python 2, dictionary elements are obtained with the dict.iteritems() member function, which was replaced by dict.items() in Python 3. "key=operator.itemgetter(1)" means the dictionary elements are sorted by their second field (the count); note that the operator module must be imported first. In short, this code records how many times each label appears among the K smallest distances and uses those counts to decide which class the test sample belongs to.
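The vote-counting and sorting step can be tried on its own (a minimal sketch with hypothetical vote counts, not taken from the classifier above):

```python
import operator

# Hypothetical vote counts for labels among the K nearest neighbors.
ClassCount = {'A': 1, 'B': 3, 'C': 2}

# Sort the (label, count) pairs by the count (field index 1), descending.
SortedClassCount = sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)
print(SortedClassCount)        # [('B', 3), ('C', 2), ('A', 1)]
print(SortedClassCount[0][0])  # B -- the winning label
```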
4. Testing
Here is the complete KNN test code:
# coding: utf-8
from numpy import *
import operator

# ==================== Create training samples ====================
def createdataset():
    group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'B', 'C', 'D']
    return group, labels

# ==================== Euclidean distance classifier ====================
def classify(Inx, Dataset, labels, k):
    DataSetSize = Dataset.shape[0]  # shape[0] is the number of rows, shape[1] the number of columns
    diffmat = tile(Inx, (DataSetSize, 1)) - Dataset
    SqDiffMat = diffmat ** 2
    SqDistances = SqDiffMat.sum(axis=1)
    Distance = SqDistances ** 0.5
    SortedDistanceIndicies = Distance.argsort()
    ClassCount = {}
    for i in range(k):
        VoteiLabel = labels[SortedDistanceIndicies[i]]
        ClassCount[VoteiLabel] = ClassCount.get(VoteiLabel, 0) + 1
    SortedClassCount = sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)
    return SortedClassCount[0][0]

Groups, Labels = createdataset()
Result = classify([0, 0], Groups, Labels, 1)
print(Result)
Running the code, the program prints the result "C". It is worth mentioning that for single-sample training classes like these (each class has only one training sample), the K value of KNN should be set to 1.
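To see the voting actually matter with K greater than 1, the same classifier can be run on a hypothetical dataset in which each class has several samples (the data below is made up for illustration and is not from the original example):

```python
import operator
from numpy import array, tile

def classify(Inx, Dataset, labels, k):
    DataSetSize = Dataset.shape[0]
    diffmat = tile(Inx, (DataSetSize, 1)) - Dataset
    Distance = ((diffmat ** 2).sum(axis=1)) ** 0.5
    SortedDistanceIndicies = Distance.argsort()
    ClassCount = {}
    for i in range(k):
        VoteiLabel = labels[SortedDistanceIndicies[i]]
        ClassCount[VoteiLabel] = ClassCount.get(VoteiLabel, 0) + 1
    return sorted(ClassCount.items(), key=operator.itemgetter(1), reverse=True)[0][0]

# Hypothetical two-class dataset with three samples per class.
Groups = array([[1.0, 1.1], [1.0, 1.0], [0.9, 1.0],
                [0.0, 0.0], [0.0, 0.1], [0.1, 0.0]])
Labels = ['A', 'A', 'A', 'B', 'B', 'B']

# All three nearest neighbors of this point carry label 'B'.
print(classify([0.2, 0.1], Groups, Labels, 3))  # B
```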


