Naive Bayes
Introduction
Naive Bayes is a supervised learning algorithm used for classification. It uses probability to select the most likely class, applying Bayes rule under the assumption that features are independent. The sections below walk through the core logic with simple examples.
Core Concepts Explained
For each class, Naive Bayes computes a probability score and picks the class with the highest score. This makes the model fast and simple.
Bayes Rule
Bayes rule links prior class probability with the likelihood of features. It gives a clear formula for scoring each class.
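In symbols, with P(class | features) denoting the probability of a class given the observed features, Bayes rule reads:

P(class | features) = P(class) * P(features | class) / P(features)

Since P(features) is the same for every class, the classifier simply picks the class that maximizes P(class) * P(features | class).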
How Naive Bayes Works
- Compute prior probability for each class
- Compute likelihood for each feature
- Apply Bayes rule
- Select the class with the highest probability (a from-scratch sketch follows this list)
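Below is a minimal from-scratch sketch of these four steps, assuming binary (0/1) features and Laplace smoothing. The dataset and helper names (train, predict) are made up for illustration; libraries such as scikit-learn implement the same logic.

import numpy as np

def train(X, y):
    # Step 1: prior probability of each class
    classes = np.unique(y)
    priors = {c: np.mean(y == c) for c in classes}
    # Step 2: smoothed per-feature likelihoods P(feature = 1 | class)
    likelihoods = {c: (X[y == c].sum(axis=0) + 1) / (np.sum(y == c) + 2)
                   for c in classes}
    return classes, priors, likelihoods

def predict(x, classes, priors, likelihoods):
    scores = {}
    for c in classes:
        p = likelihoods[c]
        # Step 3: apply Bayes rule in log space to avoid numeric underflow
        scores[c] = np.log(priors[c]) + np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    # Step 4: pick the class with the highest score
    return max(scores, key=scores.get)

X = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 1], [0, 0, 1]])
y = np.array([1, 1, 0, 0])
classes, priors, likelihoods = train(X, y)
print(predict(np.array([1, 0, 0]), classes, priors, likelihoods))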
Types of Naive Bayes
Gaussian Naive Bayes
Used when features follow a normal distribution.
Multinomial Naive Bayes
Used for text classification and other count-based features.
Bernoulli Naive Bayes
Used when features take binary values.
Use Cases
- Spam filtering
- Sentiment analysis
- Document classification
- Simple recommendation tasks
Strengths of Naive Bayes
- Fast training
- Low memory usage
- Strong for text tasks
Limitations
- The independence assumption reduces accuracy when features are correlated
- Weak when features interact strongly (see the sketch after this list)
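A minimal sketch of the independence problem, using scikit-learn's BernoulliNB on a made-up toy dataset: duplicating a feature column double-counts its evidence, so the predicted probabilities become more extreme than the data justifies.

import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Made-up toy dataset (illustrative only)
X = np.array([[1, 0], [0, 1], [1, 1], [0, 0]])
y = np.array([1, 0, 1, 0])
base = BernoulliNB().fit(X, y)

# Duplicate the first feature so the model counts its evidence twice
X_dup = np.hstack([X, X[:, [0]]])
dup = BernoulliNB().fit(X_dup, y)

print(base.predict_proba([[1, 0]]))    # around [0.25, 0.75]
print(dup.predict_proba([[1, 0, 1]]))  # more extreme, around [0.10, 0.90]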
Improving Naive Bayes
- Apply feature selection
- Clean text before training
- Use smoothing (see the example after this list)
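On the last point: smoothing adds a small pseudo-count to every feature so that a word never seen with a class does not force that class's probability to zero. In scikit-learn this is controlled by the alpha parameter, shown here at its default Laplace value of 1.0:

from sklearn.naive_bayes import MultinomialNB

model = MultinomialNB(alpha=1.0)  # alpha is the pseudo-count added to every feature count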
Syntax or Model Structure Example
This example shows a simple Naive Bayes classifier in Python.
from sklearn.naive_bayes import MultinomialNB
from sklearn.feature_extraction.text import CountVectorizer

# Tiny labeled dataset: 1 = positive, 0 = negative
texts = ["good product", "bad quality", "excellent item"]
labels = [1, 0, 1]

# Turn each text into a vector of word counts
vec = CountVectorizer()
X = vec.fit_transform(texts)

# Train the classifier on the count vectors
model = MultinomialNB()
model.fit(X, labels)

# Vectorize new text with the same vocabulary, then predict
test = vec.transform(["good quality"])
print(model.predict(test))
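The fitted model can also report per-class probabilities instead of a single label, which helps when inspecting how confident the prediction is:

print(model.predict_proba(test))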
Multiple Practical Examples
1. Gaussian NB Example
This example uses a small made-up numeric dataset so the code runs on its own.
from sklearn.naive_bayes import GaussianNB
# Made-up dataset: two continuous features per sample
X = [[5.1, 3.5], [4.9, 3.0], [6.2, 2.9], [6.7, 3.1]]
y = [0, 0, 1, 1]
model = GaussianNB()
model.fit(X, y)
print(model.predict([[5.2, 3.1]]))  # class of a new point
2. Bernoulli NB Example
This example uses a small made-up binary dataset so the code runs on its own.
from sklearn.naive_bayes import BernoulliNB
# Made-up dataset: every feature is 0 or 1
X = [[1, 0, 1, 0], [1, 1, 0, 0], [0, 1, 0, 1], [0, 0, 1, 1]]
y = [1, 1, 0, 0]
model = BernoulliNB()
model.fit(X, y)
print(model.predict([[1, 0, 1, 0]]))  # class of a new binary vector
Explanation of Each Example
The Gaussian model handles continuous numeric features by fitting a normal distribution to each feature per class. The Bernoulli model handles binary (0/1) features.
Exercises
- Explain Naive Bayes in one sentence.
- Write the Bayes rule formula and explain each term in your own words.
- Train a MultinomialNB model on a small dataset.
- Test GaussianNB with numeric features.
- List two strengths of Naive Bayes.
- List two limitations of Naive Bayes.
- Create a vocabulary using CountVectorizer.
- Explain why smoothing is important.
- Test model accuracy with and without text cleaning.
- Create a simple Naive Bayes spam filter.
Internal Linking Suggestions
[internal link: Supervised Learning Guide]
[internal link: Classification Algorithms Overview]
Conclusion
Naive Bayes delivers fast, straightforward classification. It remains strong for text tasks and simple datasets. Feature cleaning and smoothing help improve performance further.