K Nearest Neighbours in Machine Learning


Introduction

K Nearest Neighbours (KNN) is a supervised learning algorithm that works for both classification and regression. It uses the distance between data points to decide outputs.


Core Concepts Explained

KNN compares a new input with the stored training data, finds its nearest neighbours, and uses them to decide the final prediction.


How KNN Works

  • You choose a value K
  • You compute distance to all training points
  • You select the K closest points
  • You decide the output from these neighbours
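The steps above can be sketched in plain Python. The data points and labels below are made up purely for illustration:

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbours."""
    # 1. Compute the Euclidean distance from the query to every training point.
    distances = [
        (math.dist(point, query), label)
        for point, label in zip(train_X, train_y)
    ]
    # 2. Select the k closest points.
    neighbours = sorted(distances, key=lambda pair: pair[0])[:k]
    # 3. Decide the output: the most common label among the neighbours.
    labels = [label for _, label in neighbours]
    return Counter(labels).most_common(1)[0][0]

train_X = [(1.0, 1.0), (1.2, 0.8), (5.0, 5.0), (5.2, 4.8), (4.9, 5.1)]
train_y = ["A", "A", "B", "B", "B"]
print(knn_predict(train_X, train_y, (1.1, 0.9), k=3))  # → A
```

Note there is no training step: the "model" is just the stored data, and all the work happens at prediction time.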

KNN for Classification

For classification, KNN counts the classes of the K neighbours. The class with the highest count becomes the result.

Example

  • K = 5. Three neighbours are class A. Two neighbours are class B. Final class is A.
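The majority vote from the example above is one line with `collections.Counter`:

```python
from collections import Counter

# Classes of the 5 nearest neighbours from the example above.
neighbour_classes = ["A", "A", "A", "B", "B"]

# Majority vote: the most frequent class wins.
winner, count = Counter(neighbour_classes).most_common(1)[0]
print(winner, count)  # → A 3
```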

KNN for Regression

For regression, KNN averages the values of the K neighbours, producing a numeric output.

Example

  • K = 3. Neighbour values = 5, 7, 9. Output = (5 + 7 + 9) / 3 = 7.
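The regression example above in code:

```python
# Values of the 3 nearest neighbours from the example above.
neighbour_values = [5, 7, 9]

# KNN regression: the prediction is the mean of the neighbour values.
prediction = sum(neighbour_values) / len(neighbour_values)
print(prediction)  # → 7.0
```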

Distance Metrics

  • Euclidean distance
  • Manhattan distance
  • Minkowski distance
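Euclidean and Manhattan distance are both special cases of the Minkowski distance (p = 2 and p = 1 respectively). A small sketch, with points chosen for illustration:

```python
def minkowski(a, b, p):
    """Minkowski distance; p=2 gives Euclidean, p=1 gives Manhattan."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

a, b = (0.0, 0.0), (3.0, 4.0)
print(minkowski(a, b, 2))  # Euclidean → 5.0
print(minkowski(a, b, 1))  # Manhattan → 7.0
```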

Choosing K

K controls how the prediction behaves: a low K follows noise in the training data, while a high K smooths decisions but can blur class boundaries. Test different K values to find the balance.
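Testing K values is usually done with cross-validation. A minimal sketch using scikit-learn on a synthetic dataset (the dataset and the K range 1–10 are illustrative choices):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic dataset purely for illustration.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# Evaluate several K values with 5-fold cross-validation.
scores = {}
for k in range(1, 11):
    model = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(model, X, y, cv=5).mean()

best_k = max(scores, key=scores.get)
print(best_k, round(scores[best_k], 3))
```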

Strengths of KNN

  • Easy to understand
  • No training phase
  • Useful with small datasets

Limitations of KNN

  • Slow with large datasets
  • Weak in high dimensions
  • Affected by feature scale

Improving KNN

  • Scale features
  • Use dimensionality reduction
  • Tune K with validation
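Feature scaling matters because KNN is distance-based: an unscaled feature with a large range dominates the distance. One common pattern is to chain a scaler and the classifier in a pipeline; this sketch again uses a synthetic dataset for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler

# Synthetic dataset purely for illustration.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale features to [0, 1] before computing distances, then classify.
pipe = make_pipeline(MinMaxScaler(), KNeighborsClassifier(n_neighbors=5))
pipe.fit(X_train, y_train)
print(round(pipe.score(X_test, y_test), 3))
```

The pipeline applies the same scaling to test data that was fitted on the training data, which avoids leaking test statistics into the model.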

Syntax or Model Structure Example

Below is a Python example using scikit-learn.

from sklearn.neighbors import KNeighborsClassifier
import pandas as pd

# Load the training data and split it into features and labels.
data = pd.read_csv("data.csv")
X = data[["f1", "f2"]]
y = data["label"]

# Fit a classifier that votes among the 5 nearest neighbours.
model = KNeighborsClassifier(n_neighbors=5)
model.fit(X, y)

# Predict the class of a new point with f1=3.4, f2=1.2.
print(model.predict(pd.DataFrame([[3.4, 1.2]], columns=["f1", "f2"])))

This simple example shows how to use KNN with sklearn.

KNN Recap

The KNN algorithm computes the distance between a new point and the points in the training set. K is the number of neighbours the decision relies on.

KNN Classification

If you want a class, you look at the neighbours and take the majority class.

KNN Regression

If you want a number, you take the mean of the neighbours' values.

Key Points

  • A small K picks up noise
  • A large K adds smoothing
  • Scaling helps a lot

Multiple Practical Examples

1. Classification with KNN

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X, y)
print(clf.predict([[2.1, 6.4]]))

2. Regression with KNN

from sklearn.neighbors import KNeighborsRegressor

reg = KNeighborsRegressor(n_neighbors=4)
reg.fit(X, y)
print(reg.predict([[4.0, 2.3]]))

Explanation of Each Example

The first example returns a class. The second example returns a number. The workflow stays the same: distance, neighbour selection, decision.


Exercises

  • Explain KNN in one sentence.
  • Write a Python script that trains a KNN classifier.
  • Test K values from 1 to 10 and compare accuracy.
  • Compute Euclidean distance between two vectors.
  • Train a KNN regressor on a small dataset.
  • Scale features using MinMaxScaler.
  • List two strengths of KNN.
  • List two limitations of KNN.
  • Create a plot showing accuracy vs K.
  • Explain why KNN needs scaling.

Conclusion

KNN predicts outputs by checking neighbour distance. It supports classification and regression. Good scaling and proper K selection improve its results.

