Support Vector Machine in Machine Learning

Introduction

Support Vector Machine is a supervised learning algorithm. It works for classification and regression. It builds a boundary that separates data points in a clear and stable way.

SVM is a supervised learning algorithm. It works for classification and regression. It looks for a boundary that separates the data well.

Core Concepts Explained

SVM finds a hyperplane that separates classes with the widest margin. The closest points to this hyperplane are support vectors. These points control the boundary.

SVM looks for the hyperplane that stays as far as possible from the closest points. Those closest points are the support vectors.
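
Below is a minimal sketch of these ideas in Python, using a tiny made-up 2D dataset. The fitted coef_ and intercept_ describe the hyperplane, and support_vectors_ holds the closest points that control it.

import numpy as np
from sklearn.svm import SVC

# Tiny made-up 2D dataset: two well separated groups
X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]])
y = np.array([0, 0, 0, 1, 1, 1])

# Linear SVM: the boundary is the hyperplane w . x + b = 0
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

print("weights (w):", clf.coef_)                 # orientation of the hyperplane
print("intercept (b):", clf.intercept_)          # offset of the hyperplane
print("support vectors:", clf.support_vectors_)  # the closest points that set the margin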

How SVM Works

  • Receive training data
  • Find a hyperplane between classes
  • Maximize margin
  • Classify new points based on the side
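
The last step can be checked with decision_function, whose sign tells which side of the hyperplane a point falls on. A short sketch on the same kind of made-up toy data:

import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]])
y = np.array([0, 0, 0, 1, 1, 1])
clf = SVC(kernel="linear").fit(X, y)

new_points = np.array([[1.5, 1.5], [7.0, 6.0]])
# Negative values fall on the class 0 side, positive values on the class 1 side
print(clf.decision_function(new_points))
print(clf.predict(new_points))  # the predicted class follows the sign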

Support Vectors

Support vectors are the data points that lie on or inside the margin. The hyperplane is positioned by these points alone.
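
One way to see this, sketched below on made-up toy data: refitting on the support vectors alone gives essentially the same hyperplane, because the other points do not enter the solution.

import numpy as np
from sklearn.svm import SVC

# Made-up 2D toy data
X = np.array([[1, 1], [2, 2], [1, 3], [0, 2], [6, 5], [7, 7], [8, 6], [9, 8]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

full = SVC(kernel="linear", C=1.0).fit(X, y)

# Keep only the support vectors and refit: the boundary should barely change
sv = full.support_
reduced = SVC(kernel="linear", C=1.0).fit(X[sv], y[sv])

print("all points:          ", full.coef_, full.intercept_)
print("support vectors only:", reduced.coef_, reduced.intercept_)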

Linear vs Non Linear SVM

Linear SVM

Works when data can be separated with a line or plane.

Non Linear SVM

Works when data has curves. SVM uses kernels to handle complex shapes.
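
A sketch of this difference on make_circles, a built-in toy dataset where one class surrounds the other, so no straight line can separate them. Exact scores will vary, but the RBF kernel should clearly beat the linear one here.

from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two concentric rings: not separable by any straight line
X, y = make_circles(n_samples=300, factor=0.4, noise=0.08, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ["linear", "rbf"]:
    model = SVC(kernel=kernel).fit(X_train, y_train)
    print(kernel, "accuracy:", model.score(X_test, y_test))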

Kernel Trick

The kernel trick lets SVM work in a higher-dimensional space without computing the full transformation. This helps SVM split complex data.
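
A small sketch of the idea: the kernel value plays the role of a dot product in the higher-dimensional space, and the mapped coordinates are never computed. Fitting on a precomputed RBF kernel matrix should give the same predictions as SVC(kernel="rbf") with the same gamma.

from sklearn.datasets import make_circles
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, factor=0.4, noise=0.05, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
gamma = 0.5

# Usual way: sklearn applies the RBF kernel internally
direct = SVC(kernel="rbf", gamma=gamma).fit(X_train, y_train)

# Kernel trick made explicit: only pairwise similarities are computed,
# never the coordinates in the higher-dimensional space
K_train = rbf_kernel(X_train, X_train, gamma=gamma)
K_test = rbf_kernel(X_test, X_train, gamma=gamma)
precomputed = SVC(kernel="precomputed").fit(K_train, y_train)

print((direct.predict(X_test) == precomputed.predict(K_test)).all())  # expected: True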

Popular Kernels

  • Linear kernel
  • Polynomial kernel
  • RBF kernel
  • Sigmoid kernel
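
A quick sketch comparing these kernels on the same toy data; which one works best depends entirely on the shape of the data.

from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Toy data with a curved boundary
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)

for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    scores = cross_val_score(SVC(kernel=kernel), X, y, cv=5)
    print(kernel, "mean accuracy:", round(scores.mean(), 3))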

SVM for Classification

SVM draws a hyperplane between classes. It assigns a class to new points based on which side they fall on.

SVM for Regression

SVM regression builds a tube around the prediction, with a width set by the epsilon parameter. It tries to keep prediction errors inside this tube.
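
A sketch of the tube idea on made-up 1D data: epsilon sets the half-width of the tube, and only points that fall outside it become support vectors.

import numpy as np
from sklearn.svm import SVR

# Made-up 1D regression data: a noisy line
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 10, 80)).reshape(-1, 1)
y = 0.7 * X.ravel() + rng.normal(scale=0.3, size=80)

# A wider tube ignores more points, so fewer of them become support vectors
for epsilon in [0.1, 0.5, 1.0]:
    reg = SVR(kernel="linear", epsilon=epsilon).fit(X, y)
    print("epsilon =", epsilon, "->", len(reg.support_), "support vectors")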

Strengths of SVM

  • Stable in high-dimensional data
  • Strong when margins are clear
  • Flexible with kernels

Limitations of SVM

  • Slow on large datasets
  • Sensitive to kernel choices
  • Needs feature scaling
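
The scaling point is easy to check. A sketch on sklearn's built-in breast cancer dataset, whose features live on very different scales; exact numbers vary, but the scaled model is usually clearly more accurate.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

raw = SVC(kernel="rbf").fit(X_train, y_train)
scaled = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_train, y_train)

print("without scaling:", raw.score(X_test, y_test))
print("with scaling:   ", scaled.score(X_test, y_test))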

Improving SVM

  • Scale features
  • Test multiple kernels
  • Tune C and gamma
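
All three tips can be combined with a pipeline and a small grid search. A sketch; the grid values below are just illustrative starting points.

from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.25, random_state=0)

pipe = Pipeline([("scale", StandardScaler()), ("svm", SVC())])
param_grid = {
    "svm__kernel": ["linear", "rbf"],   # test multiple kernels
    "svm__C": [0.1, 1, 10],             # regularization strength
    "svm__gamma": ["scale", 0.1, 1],    # RBF width (ignored by the linear kernel)
}

search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)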

Syntax or Model Structure Example

This example shows a simple SVM classifier in Python.

from sklearn.svm import SVC
import pandas as pd

# Load the training data (expects columns f1, f2, and label)
data = pd.read_csv("data.csv")
X = data[["f1", "f2"]]
y = data["label"]

# RBF-kernel classifier with standard settings
model = SVC(kernel="rbf", C=1.0, gamma="scale")
model.fit(X, y)

# Predict the class of a new point, using the same column names as in training
print(model.predict(pd.DataFrame([[2.3, 4.1]], columns=["f1", "f2"])))

This is a simple example that shows how SVM works in sklearn.

SVM in Simple Words

The SVM algorithm tries to find a boundary that separates the data with a wide margin. This margin is very important.

Core Steps

  • Processes the training data
  • Finds a hyperplane
  • Maximizes the margin
  • Classifies according to the side

Support Vectors

They are the points that lie closest to the boundary. They are the ones that steer the hyperplane.

Kernels

If the data cannot be separated with a line, SVM uses kernels such as RBF and polynomial to split the data properly.

Multiple Practical Examples

1. Linear SVM

# Linear kernel: the weights of the separating hyperplane are exposed in coef_
model = SVC(kernel="linear")
model.fit(X, y)
print(model.coef_)

2. SVM Regression

from sklearn.svm import SVR

# SVR expects a numeric target; errors inside the epsilon tube are ignored
reg = SVR(kernel="rbf")
reg.fit(X, y)
print(reg.predict([[3.5, 1.2]]))

Explanation of Each Example

The first example builds a linear classifier. The second predicts numeric values using SVM regression.

The first one classifies. The second one predicts a number.

Exercises

  • Explain SVM in one sentence.
  • Define a hyperplane.
  • Train a linear SVM model in Python.
  • Train an RBF kernel model and compare results.
  • List two strengths of SVM.
  • List two limitations of SVM.
  • Test SVM with scaled vs unscaled features.
  • Modify C and observe the effect.
  • Modify gamma and observe the effect.
  • Use SVM regression on a simple dataset.

Conclusion

SVM builds solid boundaries for classification and regression. Kernels help it handle complex shapes. Scaling and tuning remain important for strong results.

SVM provides strong boundaries and classifies data accurately. Tuning and scaling are essential for solid performance.
