
What is the Naive Bayes Algorithm?

The Naive Bayes Algorithm is a classification method based on Bayes' theorem. In essence, it assumes that, within a class, the occurrence of one feature is completely independent of the occurrence of any other feature.

The algorithm is called naive because it treats the features as completely independent of each other, with each feature contributing on its own to the probability of the class. A simple example of this: a car is characterized by having four wheels, being about 4-5 meters long, and being able to drive. Each of these three features contributes independently to the conclusion that the object is a car.

How the Algorithm works

The Naive Bayes algorithm is based on Bayes' theorem. It provides a formula for calculating the conditional probability P(A|B), in words: the probability that event A occurs given that event B has occurred. As an example: what is the probability that I have Corona (= event A) given that my rapid test is positive (= event B)?

According to Bayes, this conditional probability can be calculated using the following formula:

\[P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}\]

  • P(B|A) = probability that event B occurs if event A has already occurred
  • P(A) = probability that event A occurs
  • P(B) = probability that event B occurs

Why should we use this formula? Let us return to our example with the positive test and the Corona disease. The conditional probability P(A|B) is not directly known and could only be determined through an elaborate experiment. The inverse probability P(B|A), on the other hand, is much easier to determine. In words, it means: how likely is it that a person suffering from Corona gets a positive rapid test?

This probability can be determined relatively easily by having demonstrably ill persons take a rapid test and then calculating the proportion of tests that actually came back positive. The probabilities P(A) and P(B) are similarly easy to estimate. With these values, the formula makes it easy to calculate the conditional probability P(A|B).
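As a small sketch of this calculation, the snippet below simply plugs numbers into Bayes' theorem. All values for the test sensitivity, the prevalence, and the overall positive-test rate are assumptions chosen only for illustration.

```python
# Hypothetical numbers, chosen only to illustrate Bayes' theorem
p_pos_given_sick = 0.95   # P(B|A): positive rapid test given the person has Corona
p_sick = 0.01             # P(A): prior probability of having Corona
p_pos = 0.06              # P(B): overall probability of a positive rapid test

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_sick_given_pos = p_pos_given_sick * p_sick / p_pos
print(f"P(sick | positive test) = {p_sick_given_pos:.3f}")  # ~0.158 with these numbers
```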

If we have only one feature, this already describes the complete Naive Bayes algorithm. For a single feature x, the conditional probability P(K | x) is calculated for every class K, and the class with the highest probability wins. For our example, this means that the conditional probabilities P(the person is sick | test is positive) and P(the person is healthy | test is positive) are calculated using Bayes' theorem, and the observation is assigned to the class with the higher probability.
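A minimal sketch of this single-feature case, again with assumed numbers: we compute the (unnormalized) posterior for both classes given the same observation, a positive test, and pick the larger one.

```python
# Assumed probabilities for illustration only
p_pos_given_sick = 0.95     # P(test positive | sick)
p_pos_given_healthy = 0.05  # P(test positive | healthy)
p_sick, p_healthy = 0.01, 0.99  # class priors

# Unnormalized posteriors: P(class | positive test) is proportional to
# P(positive test | class) * P(class)
score_sick = p_pos_given_sick * p_sick
score_healthy = p_pos_given_healthy * p_healthy

# The denominator P(test positive) is the same for both classes and can be ignored
prediction = "sick" if score_sick > score_healthy else "healthy"
print(prediction, score_sick, score_healthy)
```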

If our dataset consists of more than one feature, we proceed similarly and compute the conditional probability for each combination of feature x and class K. For each class, we then multiply the probabilities of all features (together with the class prior). The class K with the highest product of probabilities is the predicted class for the data point.
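The following sketch shows this multiplication over several features for each class, staying with the Corona example and adding a second feature. The likelihood values and priors are again made-up numbers for illustration.

```python
# Made-up likelihoods P(feature | class) for two features and two classes
likelihoods = {
    "sick":    {"positive_test": 0.95, "fever": 0.60},
    "healthy": {"positive_test": 0.05, "fever": 0.10},
}
priors = {"sick": 0.01, "healthy": 0.99}

observed = ["positive_test", "fever"]  # features observed for one person

scores = {}
for cls, prior in priors.items():
    score = prior
    for feature in observed:
        # naive assumption: features are independent given the class,
        # so their likelihoods are simply multiplied
        score *= likelihoods[cls][feature]
    scores[cls] = score

# The class with the highest product is the prediction
print(max(scores, key=scores.get), scores)
```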

Advantages and Disadvantages of the Naive Bayes Algorithm

The Naive Bayes Algorithm is a popular starting point for a classification application since it is very easy and fast to train and can deliver good results in some cases. If the assumption of independence between the individual features actually holds, it can even outperform comparable classification models, such as logistic regression, and requires less training data.
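To illustrate how little code such a baseline takes, the sketch below trains scikit-learn's GaussianNB on the small Iris toy dataset. The dataset choice and the resulting accuracy are purely illustrative.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Load a small example dataset and split it into training and test data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Training a Naive Bayes classifier is a single fit call
model = GaussianNB()
model.fit(X_train, y_train)

print("Test accuracy:", model.score(X_test, y_test))
```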

Although the Naive Bayes Algorithm can achieve good results with little data, we need enough data so that every class and, for categorical features, every combination of feature value and class appears at least once in the training dataset. Otherwise, the Naive Bayes Classifier assigns a probability of 0 to that category when it appears in the test dataset. Moreover, in reality, it is very unlikely that all input variables are completely independent of each other, and this assumption is also very difficult to test.
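In practice, this zero-probability issue is usually mitigated with smoothing. The sketch below uses scikit-learn's CategoricalNB, whose alpha parameter (Laplace smoothing, default 1.0) keeps unseen feature-class combinations from collapsing the product to exactly zero; the tiny dataset is made up for illustration.

```python
import numpy as np
from sklearn.naive_bayes import CategoricalNB

# Tiny made-up dataset: one categorical feature encoded as integers
X_train = np.array([[0], [0], [1], [1]])
y_train = np.array([0, 0, 1, 1])

# alpha=1.0 applies Laplace smoothing, so a feature value that was never seen
# together with a class still gets a small non-zero probability instead of 0
model = CategoricalNB(alpha=1.0)
model.fit(X_train, y_train)

# Feature value 1 never occurred with class 0, yet class 0 keeps a probability > 0
print(model.predict_proba(np.array([[1]])))
```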

This is what you should take with you

  • The Naive Bayes Algorithm is a simple method to classify data.
  • It is based on Bayes’ theorem and is called naive because it assumes that all input variables and their values are independent of each other.
  • The Naive Bayes Algorithm is relatively quick and easy to train, but in many cases, it does not give good results because the assumption of independence of the variables is violated.

Other Articles on the Topic of Naive Bayes

  • Scikit-Learn provides some examples and programming instructions for the Naive Bayes algorithm in Python.