Consider the following training set.
Red marks represent class 0 and green marks represent class 1.
Treat the white point as the query point (the point whose class label is to be predicted).
If we pass the above dataset to a plain kNN classifier, the classifier will declare the query point to be of class 0, even though the graph shows that the query point is closer to the class 1 points than to the class 0 points. The weighted kNN algorithm was introduced to overcome this disadvantage. In weighted kNN, the k nearest points are weighted using a function called a kernel function. The intuition behind weighted kNN is to give more weight to nearby points and less weight to points that are farther away, so a few close neighbours can outvote a larger number of distant ones. Any function whose value decreases with increasing distance can be used as the kernel function for the weighted kNN classifier. The simplest and most commonly used choice is the inverse of the distance.
Consider 0 as the label for class 0 and 1 as the label for class 1. Below is the implementation of the weighted kNN algorithm.
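The original code listing was not preserved, so the sketch below is a reconstruction of the algorithm described above. The coordinates in `train` and the `weighted_knn` function are illustrative assumptions, not the article's exact data; they are chosen so that a plain majority vote over the 3 nearest neighbours would pick class 0, while the inverse-distance weighting picks class 1.

```python
import math

# Hypothetical stand-in for the article's graph: three class 0 points
# and one class 1 point that lies closest to the query.
train = [
    ((1.0, 1.0), 0),
    ((1.2, 1.2), 0),
    ((2.0, 2.0), 0),
    ((0.5, 0.5), 1),
]
query = (0.0, 0.0)

def weighted_knn(train, query, k=3):
    """Classify `query` by an inverse-distance-weighted vote of its k nearest neighbours."""
    # Sort training points by Euclidean distance to the query point.
    neighbours = sorted((math.dist(point, query), label) for point, label in train)
    # Accumulate the kernel weight (inverse distance) for each class label.
    weights = {}
    for dist, label in neighbours[:k]:
        # Small epsilon avoids division by zero when a point coincides with the query.
        weights[label] = weights.get(label, 0.0) + 1.0 / (dist + 1e-9)
    # The predicted class is the label with the largest total weight.
    return max(weights, key=weights.get)

print("The value classified to the query point is:", weighted_knn(train, query))
```

Here the single class 1 neighbour at distance ≈ 0.71 carries more weight (≈ 1.41) than the two class 0 neighbours combined (≈ 1.30), so the weighted vote flips the prediction that an unweighted majority vote would have given.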
Output: The value classified to the query point is: 1
Time complexity: O(N), where N is the number of points in the training set.