Naive Bayes is one of the simplest yet most powerful algorithms for **classification**. It is based on Bayes’ Theorem with an assumption of independence among predictors.

**What is Bayes’ Theorem?**

In statistics and probability theory, Bayes’ theorem describes the probability of an event based on prior knowledge of conditions that might be related to the event. It provides a way to compute a conditional probability.

Given a hypothesis **H** and evidence **E**, Bayes’ Theorem states that the relationship between the probability of the hypothesis before getting the evidence, **P(H)**, and the probability of the hypothesis after getting the evidence, **P(H|E)**, is:

```
# Mathematical formula of Bayes' Theorem
P(H|E) = P(E|H) * P(H) / P(E)
```
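The formula translates directly into code. As a quick sanity check, here is a small helper function (the name `bayes_posterior` is just for illustration, not part of any library):

```
# Hypothetical helper illustrating Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E)
def bayes_posterior(p_e_given_h, p_h, p_e):
    """Return the posterior probability P(H|E)."""
    return p_e_given_h * p_h / p_e

# Example: P(E|H) = 0.8, P(H) = 0.5, P(E) = 0.4
print(bayes_posterior(0.8, 0.5, 0.4))  # 1.0
```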

**Bayes’ Theorem Example**

Let’s suppose we have a deck of cards, and we wish to find out the “**probability that a card picked at random is a King, given that it is a Face Card**“ (the Face Cards are the Jack, Queen, and King). According to Bayes’ Theorem, we can solve this problem.

First, we need to find out the individual probabilities:

**P(King)** is **4/52**, as there are 4 Kings in a deck of cards. **P(Face|King)** is equal to **1**, as every King is a Face Card. **P(Face)** is equal to **12/52**, as there are 3 Face Cards in a suit of 13 cards and there are 4 suits in total.

Putting all these values into Bayes’ Theorem gives P(King|Face) = 1 × (4/52) / (12/52) = 1/3.
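The calculation can be checked numerically by plugging the three values into the formula (variable names here are just for illustration):

```
# P(H): the chosen rank appears 4 times in a 52-card deck
p_h = 4 / 52
# P(E|H): every card of that rank is a Face Card
p_e_given_h = 1
# P(E): 12 Face Cards in the deck (J, Q, K in each of 4 suits)
p_e = 12 / 52

p_h_given_e = p_e_given_h * p_h / p_e
print(p_h_given_e)  # 1/3, i.e. about 0.333
```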

**How to Implement It Using Python**

First, import the required libraries:

```
import numpy as np
from sklearn.naive_bayes import BernoulliNB
```

Next, create the training data and the target values:

```
# Some binary feature data to train on
X = np.array([[0, 1, 0], [0, 1, 1], [1, 1, 0]])
# Target values (classes: 'good' or 'Not a good')
y = np.array(['good', 'Not a good', 'Not a good'])
```

The data is now ready to fit to the Naive Bayes classifier.

`print(X)`

**Output:**

array([[0, 1, 0], [0, 1, 1], [1, 1, 0]])

`print(y)`

**Output:**

array(['good', 'Not a good', 'Not a good'], dtype='<U10')

```
# Create the Bernoulli Naive Bayes model
model = BernoulliNB()
# Train (fit) the model on our data
model.fit(X, y)
```

**Output:**

BernoulliNB(alpha=1.0, binarize=0.0, class_prior=None, fit_prior=True)
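(The exact repr printed here depends on the scikit-learn version.) Once fitted, the model’s learned attributes can be inspected. As a small sketch, refitting on the same toy data so the snippet runs on its own: `classes_` holds the labels in sorted order, and `class_count_` the number of training samples seen per class:

```
import numpy as np
from sklearn.naive_bayes import BernoulliNB

X = np.array([[0, 1, 0], [0, 1, 1], [1, 1, 0]])
y = np.array(['good', 'Not a good', 'Not a good'])

model = BernoulliNB()
model.fit(X, y)

# Class labels, stored in sorted order
print(model.classes_)      # ['Not a good' 'good']
# Number of training samples seen per class
print(model.class_count_)  # [2. 1.]
```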

**Predict the result:**

```
print("What does our model think this should be?")
print("Answer: %s!" % model.predict([[0, 0, 0]])[0])
```

**Output:**

What does our model think this should be? Answer: good!
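Beyond the hard label, `predict_proba` reports how confident the model is in each class. A minimal sketch, refitting the same toy data so it runs on its own:

```
import numpy as np
from sklearn.naive_bayes import BernoulliNB

X = np.array([[0, 1, 0], [0, 1, 1], [1, 1, 0]])
y = np.array(['good', 'Not a good', 'Not a good'])

model = BernoulliNB().fit(X, y)

# Per-class probabilities for the same query point as above
proba = model.predict_proba(np.array([[0, 0, 0]]))[0]
for label, p in zip(model.classes_, proba):
    print(f"{label}: {p:.3f}")
```

The probabilities sum to 1, and the class with the highest probability is the one `predict` returned above.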