
How Does the Naive Bayes Classifier Work Mathematically?

Previously, we have written blogs on many machine learning algorithms as well as other topics to help you broaden your knowledge. Please visit our site; we would be glad to get feedback from you to improve our writing. You can follow the mentioned links to see some of them.

In this blog, we are going to discuss the mathematics behind Naive Bayes by working through one example problem.

Introduction to Naive Bayes

Naive Bayes is a simple probabilistic classification algorithm in machine learning. It calculates the posterior probability of each class from prior knowledge, under the assumption that every feature is independent of the others. It works for multi-class classification problems, and the class with the highest posterior probability is chosen as the prediction for the given features. Let's work through one example to see clearly how Naive Bayes works mathematically.
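The decision rule described above can be sketched in a few lines of Python. This is only a minimal illustration with hypothetical numbers, not a full implementation:

```python
# Naive Bayes scores each class by multiplying its prior probability
# with the conditional probability of every feature value, treating
# the features as independent of one another.
def naive_bayes_score(prior, feature_probs):
    score = prior
    for p in feature_probs:
        score *= p
    return score

# Hypothetical numbers: the class with the larger score wins.
score_a = naive_bayes_score(0.6, [0.2, 0.5])   # 0.6 * 0.2 * 0.5 = 0.06
score_b = naive_bayes_score(0.4, [0.7, 0.9])   # 0.4 * 0.7 * 0.9 = 0.252
prediction = "B" if score_b > score_a else "A"
```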

[Q.N] Predict the class label of the tuple X = (age = youth, income = medium, student = yes, credit_rating = fair) using Bayesian classification, given the following training data.

Age          Income   Student   Credit_rating   Buys_computer
youth        high     no        fair            no
youth        high     no        excellent       no
middle_age   high     no        fair            yes
senior       medium   no        fair            yes
senior       low      yes       fair            yes
senior       low      yes       excellent       no
middle_age   low      yes       excellent       yes
youth        medium   no        fair            no
youth        low      yes       fair            yes
senior       medium   yes       fair            yes
youth        medium   yes       excellent       yes
middle_age   medium   no        excellent       yes
middle_age   high     yes       fair            yes
senior       medium   no        excellent       no

Solution:

Step I

The prior probability of each class is computed from the training tuples:

P(Buys_computer = yes) = (number of yes tuples) / (total tuples) = 9/14 = 0.643

P(Buys_computer = no) = (number of no tuples) / (total tuples) = 5/14 = 0.357
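As a quick check, this prior computation can be reproduced in Python. The sketch below assumes the 14 class labels from the training table, in row order:

```python
from collections import Counter

# Class labels of the 14 training tuples (9 yes, 5 no),
# matching the counts used above.
labels = ["no", "no", "yes", "yes", "yes", "no", "yes",
          "no", "yes", "yes", "yes", "yes", "yes", "no"]

counts = Counter(labels)
total = len(labels)

# Prior probability of each class = class count / total tuples.
priors = {cls: n / total for cls, n in counts.items()}

print(round(priors["yes"], 3))  # 0.643
print(round(priors["no"], 3))   # 0.357
```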

Step II

Next, compute the conditional probability of each feature value given each class:

P(Age = youth | Buys_computer = yes) = (youth tuples among yes) / (total yes) = 2/9 = 0.222

P(Age = youth | Buys_computer = no) = (youth tuples among no) / (total no) = 3/5 = 0.6

P(Income = medium | Buys_computer = yes) = (medium tuples among yes) / (total yes) = 4/9 = 0.444

P(Income = medium | Buys_computer = no) = (medium tuples among no) / (total no) = 2/5 = 0.4

P(Student = yes | Buys_computer = yes) = (student tuples among yes) / (total yes) = 6/9 = 0.667

P(Student = yes | Buys_computer = no) = (student tuples among no) / (total no) = 1/5 = 0.2

P(Credit_rating = fair | Buys_computer = yes) = (fair tuples among yes) / (total yes) = 6/9 = 0.667

P(Credit_rating = fair | Buys_computer = no) = (fair tuples among no) / (total no) = 2/5 = 0.4
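These conditional probabilities can also be verified programmatically. The sketch below assumes the standard 14-tuple training set that the counts above are drawn from; `cond_prob` is a helper name introduced here for illustration:

```python
# Training tuples: (age, income, student, credit_rating, buys_computer)
data = [
    ("youth",      "high",   "no",  "fair",      "no"),
    ("youth",      "high",   "no",  "excellent", "no"),
    ("middle_age", "high",   "no",  "fair",      "yes"),
    ("senior",     "medium", "no",  "fair",      "yes"),
    ("senior",     "low",    "yes", "fair",      "yes"),
    ("senior",     "low",    "yes", "excellent", "no"),
    ("middle_age", "low",    "yes", "excellent", "yes"),
    ("youth",      "medium", "no",  "fair",      "no"),
    ("youth",      "low",    "yes", "fair",      "yes"),
    ("senior",     "medium", "yes", "fair",      "yes"),
    ("youth",      "medium", "yes", "excellent", "yes"),
    ("middle_age", "medium", "no",  "excellent", "yes"),
    ("middle_age", "high",   "yes", "fair",      "yes"),
    ("senior",     "medium", "no",  "excellent", "no"),
]

def cond_prob(col, value, cls):
    """Estimate P(feature[col] = value | Buys_computer = cls) by counting."""
    in_class = [row for row in data if row[4] == cls]
    return sum(row[col] == value for row in in_class) / len(in_class)

print(round(cond_prob(0, "youth", "yes"), 3))  # 0.222
print(round(cond_prob(3, "fair", "no"), 3))    # 0.4
```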

Step III

Using the probabilities above, compute the likelihood of the feature vector X under each class. By the independence assumption, this is the product of the individual conditional probabilities:

P(X | Buys_computer = yes) = P(Age = youth | Buys_computer = yes) × P(Income = medium | Buys_computer = yes) × P(Student = yes | Buys_computer = yes) × P(Credit_rating = fair | Buys_computer = yes) = 0.222 × 0.444 × 0.667 × 0.667 = 0.044

P(X | Buys_computer = no) = P(Age = youth | Buys_computer = no) × P(Income = medium | Buys_computer = no) × P(Student = yes | Buys_computer = no) × P(Credit_rating = fair | Buys_computer = no) = 0.6 × 0.4 × 0.2 × 0.4 = 0.019

Step IV

Compute the posterior probability of each class. Since the denominator P(X) in Bayes' theorem is the same for both classes, we can drop it and compare the numerators directly:

P(Buys_computer = yes | X) ∝ P(X | Buys_computer = yes) × P(Buys_computer = yes) = 0.044 × 0.643 = 0.028

P(Buys_computer = no | X) ∝ P(X | Buys_computer = no) × P(Buys_computer = no) = 0.019 × 0.357 = 0.007
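Steps III and IV can be chained together in a few lines of code. The sketch below simply multiplies the probabilities computed above, with values rounded as in the text:

```python
# Likelihood of X = (youth, medium, student=yes, fair) under each class,
# as a product of the per-feature conditional probabilities.
likelihood_yes = 0.222 * 0.444 * 0.667 * 0.667   # P(X | yes) ~ 0.044
likelihood_no = 0.6 * 0.4 * 0.2 * 0.4            # P(X | no)  ~ 0.019

# Multiply by the class priors to get the unnormalized posteriors.
posterior_yes = likelihood_yes * 0.643           # ~ 0.028
posterior_no = likelihood_no * 0.357             # ~ 0.007

prediction = "yes" if posterior_yes > posterior_no else "no"
print(prediction)  # yes
```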

Conclusion:

Since,

P(Buys_computer = yes | X) > P(Buys_computer = no | X)

the final conclusion is that the classification for the given feature tuple X is Buys_computer = yes.
