Intuition:
Consider the following training set.
Red marks represent points of class 0 and green marks points of class 1.
Treat the white point as the query point (the point whose class label is to be predicted).
If we pass the above dataset to a kNN-based classifier, the classifier will declare the query point to be of class 0, even though the graph shows that the point is closer to the class-1 points than to the class-0 points. To overcome this drawback, weighted kNN is used. In weighted kNN, the k nearest points are weighted using a function called the kernel function. The intuition behind weighted kNN is to give more weight to nearby points and less weight to points that are farther away. Any function whose value decreases as the distance increases can be used as the kernel function for the weighted kNN classifier. The simplest such function is the inverse of the distance.
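As a quick illustration, here is a minimal sketch of two common kernel choices (the function names are my own, not from the original); both assign larger weights to nearer points:

```python
import math

def inverse_distance(d, eps=1e-9):
    # simplest kernel: weight falls off as 1/d; eps guards against division by zero
    return 1.0 / (d + eps)

def gaussian(d, bandwidth=1.0):
    # alternative kernel: smooth exponential decay with distance
    return math.exp(-(d ** 2) / (2 * bandwidth ** 2))

# nearer points receive larger weights under either kernel
for d in (0.5, 1.0, 2.0):
    print(d, round(inverse_distance(d), 3), round(gaussian(d), 3))
```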
Algorithm :
 Let L = {(xᵢ, yᵢ), i = 1, …, n} be the training set of observations xᵢ with given class labels yᵢ, and let x be a new observation (query point) whose class label y is to be predicted.
 Compute d(xᵢ, x) for i = 1, …, n, the distance between the query point and every point in the training set.
 Select D′ ⊆ D, the set of the k training points closest to the query point.
 Predict the class of the query point using distance-weighted voting, where v ranges over the class labels:
 y = arg maxᵥ Σ over (xᵢ, yᵢ) ∈ D′ of wᵢ · 1(yᵢ = v),  with weights wᵢ = 1 / d(xᵢ, x)
Implementation:
Consider 0 as the label for class 0 and 1 as the label for class 1. Below is an implementation of the weighted kNN algorithm.

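The original code listing did not survive extraction; the following is a minimal sketch of a weighted kNN classifier using inverse-distance weighting. The training coordinates are illustrative assumptions (the original figure is unavailable), chosen so that the class-1 points lie closer to the query than the class-0 points:

```python
import math

def weighted_knn(points, query, k=3):
    # points: dict mapping class label -> list of (x, y) training points
    distances = []
    for label, pts in points.items():
        for (px, py) in pts:
            d = math.sqrt((px - query[0]) ** 2 + (py - query[1]) ** 2)
            distances.append((d, label))
    distances.sort()  # sort by distance to the query point
    # inverse-distance weighted vote over the k nearest neighbours
    weights = {}
    for d, label in distances[:k]:
        weights[label] = weights.get(label, 0.0) + 1.0 / (d + 1e-9)
    return max(weights, key=weights.get)

if __name__ == "__main__":
    # assumed training set: class-0 points lie farther from the query
    points = {
        0: [(1, 5), (2, 6), (3, 7), (1, 6)],
        1: [(3, 4), (4, 5), (2.5, 3)],
    }
    query = (3.5, 4.5)
    print("The value classified to query point is:", weighted_knn(points, query, k=3))
```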

Output:
The value classified to query point is: 1
Time complexity: O(N) distance computations, where N is the number of points in the training set; selecting the k nearest neighbours adds O(N log N) if done by sorting the distances.