Elbow method for optimal k-value in KMeans

A basic step for any unsupervised clustering algorithm is to determine the optimal number of clusters into which the data can be grouped. The elbow method is one of the most popular ways of determining this optimal k-value.

We will now demonstrate this method using K-Means clustering with the Python scikit-learn library.

Step 1: Import required libraries

from sklearn.cluster import KMeans
from sklearn import metrics
from scipy.spatial.distance import cdist
import numpy as np
import matplotlib.pyplot as plt

Step 2: Create and visualize data

# Data creation
x1 = np.array([3, 1, 1, 2, 1, 6, 6, 6, 5, 6, 7, 8, 9, 8, 9, 9, 8])
x2 = np.array([5, 4, 5, 6, 5, 8, 6, 7, 6, 7, 1, 2, 1, 2, 3, 2, 3])
X = np.array(list(zip(x1, x2))).reshape(len(x1), 2)

# Data visualization
plt.plot()
plt.xlim([0, 10])
plt.ylim([0, 10])
plt.title('Dataset')
plt.scatter(x1, x2)
plt.show()

From the above visualization, we can see that the optimal number of clusters should be about 3. However, visual inspection alone does not always give a reliable answer, so the following steps show how to confirm this with the elbow method.

We now define the following two quantities (a short numerical check follows this list):

  1. Distortion: the average distance from each point to its nearest cluster center. Typically, the Euclidean distance metric is used, and this is what the code below computes with cdist.
  2. Inertia: the sum of the squared distances of the samples to their closest cluster center, exposed by scikit-learn as the inertia_ attribute.
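As a quick check of these two definitions, here is a minimal sketch for a single fitted model, assuming the imports from Step 1 and the X array created in Step 2 above (the variable names are only illustrative): the distortion is the mean nearest-center distance, while the computed inertia should closely match scikit-learn's inertia_ attribute.

# Minimal sketch: distortion and inertia for a single fitted model (uses X from Step 2)
model = KMeans(n_clusters=3).fit(X)
dists = np.min(cdist(X, model.cluster_centers_, 'euclidean'), axis=1)

distortion = dists.mean()       # mean distance to the nearest cluster center
inertia = np.sum(dists ** 2)    # sum of squared distances to the nearest cluster center

print(distortion)
print(inertia, model.inertia_)  # the two inertia values should agree closely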

We iterate over k values from 1 to 9 and calculate the distortion and inertia for each value of k in this range.

Step 3: Build the clustering model and calculate the distortion and inertia values

distortions = []
inertias = []
mapping1 = {}
mapping2 = {}
K = range(1, 10)

for k in K:
    # Build and fit the model
    kmeanModel = KMeans(n_clusters=k).fit(X)

    # Distortion: mean Euclidean distance to the nearest cluster center
    distortions.append(sum(np.min(cdist(X, kmeanModel.cluster_centers_,
                                        'euclidean'), axis=1)) / X.shape[0])
    # Inertia: sum of squared distances to the nearest cluster center
    inertias.append(kmeanModel.inertia_)

    mapping1[k] = distortions[-1]
    mapping2[k] = inertias[-1]

Step 4: Tabulating and visualizing the results

a) Using the distortion values

for key, val in mapping1.items():
    print(str(key) + ' : ' + str(val))


plt.plot(K, distortions, 'bx-')
plt.xlabel('Values of K')
plt.ylabel('Distortion')
plt.title('The Elbow Method using Distortion')
plt.show()

b) Using the inertia values

for key, val in mapping2.items():
    print(str(key) + ' : ' + str(val))

plt.plot(K, inertias, 'bx-')
plt.xlabel('Values of K')
plt.ylabel('Inertia')
plt.title('The Elbow Method using Inertia')
plt.show()

To determine the optimal number of clusters, we select the value of k at the "elbow" (knee) of the plot, i.e. the point after which the distortion/inertia begins to decrease in a roughly linear fashion. For the given data, we therefore conclude that the optimal number of clusters is 3.
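As a rough illustration of this rule of thumb (a sketch only, not part of the original method), one could also pick the elbow programmatically by stopping at the first k after which the relative drop in inertia becomes small. The helper name pick_elbow and the 15% cutoff below are assumptions chosen for this example:

# Heuristic sketch: choose the first k after which the relative drop in inertia is small
def pick_elbow(ks, values, threshold=0.15):
    ks = list(ks)
    for i in range(1, len(values)):
        prev, curr = values[i - 1], values[i]
        # relative improvement gained by adding one more cluster
        if prev > 0 and (prev - curr) / prev < threshold:
            return ks[i - 1]
    return ks[-1]

print('Suggested k:', pick_elbow(K, inertias))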

The clustered data points for different k values (a plotting sketch follows this list):

1. k = 1

2. k = 2

3. k = 3

4. k = 4
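The figures referenced above are not reproduced here, but a minimal plotting sketch (assuming the X array from Step 2; variable names are illustrative only) that regenerates comparable scatter plots for k = 1 through 4 could look like this:

# Sketch: scatter plots of the cluster assignments for k = 1..4 (uses X from Step 2)
fig, axes = plt.subplots(1, 4, figsize=(16, 4))
for ax, k in zip(axes, range(1, 5)):
    model = KMeans(n_clusters=k).fit(X)
    ax.scatter(X[:, 0], X[:, 1], c=model.labels_)          # points colored by cluster
    ax.scatter(model.cluster_centers_[:, 0],
               model.cluster_centers_[:, 1],
               c='red', marker='x')                         # cluster centers
    ax.set_title('k = ' + str(k))
    ax.set_xlim([0, 10])
    ax.set_ylim([0, 10])
plt.show()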