
Based on the idea of probability of dependent events:
Pasted image 20240905081514.png
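For reference, the standard dependent-events formula (most likely what the pasted image shows): for events A and B,

$$P(A \cap B) = P(A)\,P(B \mid A) \quad\Longleftrightarrow\quad P(B \mid A) = \frac{P(A \cap B)}{P(A)}$$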

Bayes' Theorem

Pasted image 20240905081604.png
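In the notation used below, with output y and inputs X, Bayes' theorem reads:

$$P(y \mid X) = \frac{P(X \mid y)\,P(y)}{P(X)}$$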

Naive Bayes Model Concept

Suppose there exist input features (x1, x2, x3, …) and an output feature (y)
Pasted image 20240905081840.png

The conditional probability relating these input and output (dependent) features:
Pasted image 20240905082239.png
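Written out, and using the "naive" assumption that x1, x2, x3 are conditionally independent given y, the posterior is:

$$P(y \mid x_1, x_2, x_3) = \frac{P(x_1 \mid y)\,P(x_2 \mid y)\,P(x_3 \mid y)\,P(y)}{P(x_1, x_2, x_3)}$$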

For binary / multi-class classification, we solve for the probability of each class y given the events (x1, x2, x3); the denominator P(x1, x2, x3) is the same for every class, so it cancels out as a constant.
Pasted image 20240905082327.png
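After dropping the constant denominator, the predicted class is whichever y maximizes the remaining product:

$$\hat{y} = \arg\max_{y}\; P(y)\prod_{i} P(x_i \mid y)$$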

Example (Naive Bayes):

Input (Outlook, Temperature) -> Output (Play) for binary classification (Yes/No)

We build probability tables for each case of the inputs (Outlook, Temperature) against the output (Play); see the sketch below the table image.
Pasted image 20240905082506.png
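A minimal from-scratch sketch of how these prior and likelihood tables can be computed in Python. The dataset below is a small hypothetical sample, not the exact data behind the image above:

```python
# A minimal sketch, assuming a small hypothetical weather dataset
# (the counts in the note's actual tables may differ).
from collections import Counter, defaultdict

# Hypothetical training rows: (Outlook, Temperature, Play)
data = [
    ("Sunny", "Hot", "No"),
    ("Sunny", "Hot", "No"),
    ("Overcast", "Hot", "Yes"),
    ("Rainy", "Mild", "Yes"),
    ("Rainy", "Cool", "Yes"),
    ("Sunny", "Mild", "No"),
    ("Overcast", "Cool", "Yes"),
    ("Rainy", "Mild", "No"),
]

# Prior probabilities P(Play = y)
class_counts = Counter(row[2] for row in data)
total = len(data)
priors = {y: n / total for y, n in class_counts.items()}

# Counts for the conditional tables P(feature value | Play = y)
likelihoods = defaultdict(lambda: defaultdict(Counter))
for outlook, temp, play in data:
    likelihoods["Outlook"][play][outlook] += 1
    likelihoods["Temperature"][play][temp] += 1

def cond_prob(feature, value, y):
    """P(feature = value | Play = y), estimated from the counts above."""
    return likelihoods[feature][y][value] / class_counts[y]

print(priors)                                # e.g. {'No': 0.5, 'Yes': 0.5}
print(cond_prob("Outlook", "Sunny", "No"))   # P(Sunny | Play = No)
```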

To find the probability of each class when Outlook = "Sunny" and Temperature = "Hot":
Pasted image 20240905082743.png
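In generic form (the actual numbers are read off the probability tables above):

$$P(\text{Yes} \mid \text{Sunny}, \text{Hot}) \propto P(\text{Sunny} \mid \text{Yes})\,P(\text{Hot} \mid \text{Yes})\,P(\text{Yes})$$
$$P(\text{No} \mid \text{Sunny}, \text{Hot}) \propto P(\text{Sunny} \mid \text{No})\,P(\text{Hot} \mid \text{No})\,P(\text{No})$$

The class with the larger value is the prediction.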

Conclusion:
Pasted image 20240905082822.png
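The same workflow can be sketched with scikit-learn's CategoricalNB; the training rows below are the same hypothetical sample as before and only illustrate the API (CategoricalNB applies Laplace smoothing by default, so its numbers may differ slightly from hand-computed counts):

```python
# A minimal sketch using scikit-learn's CategoricalNB on hypothetical
# (Outlook, Temperature) -> Play data.
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

X = [["Sunny", "Hot"], ["Sunny", "Hot"], ["Overcast", "Hot"],
     ["Rainy", "Mild"], ["Rainy", "Cool"], ["Sunny", "Mild"],
     ["Overcast", "Cool"], ["Rainy", "Mild"]]
y = ["No", "No", "Yes", "Yes", "Yes", "No", "Yes", "No"]

# Encode categorical strings as integer codes, as CategoricalNB expects.
enc = OrdinalEncoder()
X_enc = enc.fit_transform(X)

clf = CategoricalNB()
clf.fit(X_enc, y)

# Predict for Outlook = "Sunny", Temperature = "Hot"
query = enc.transform([["Sunny", "Hot"]])
print(clf.predict(query))        # predicted class label
print(clf.predict_proba(query))  # posterior probabilities, ordered by clf.classes_
```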
