Based on the idea of the probability of dependent events:
Bayes' Theorem
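In standard notation (the event symbols A and B below are mine, not from the notes), the theorem states that for events A and B with P(B) > 0:

P(A | B) = P(B | A) * P(A) / P(B)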
Naive Bayes Model Concept
Suppose there exist input features (x1, x2, x3, ...) and an output feature (y).
Bayes' Theorem links the probability of the output to these I/P features, treating them as dependent events.
For Binary / Multi Class Classification we evaluate the probability of y given the events (x1, x2, x3) for each class, and we cancel the constant denominator because it is the same for every class and evens out in the comparison.
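Written out for the features above (a standard restatement, assuming the usual "naive" conditional independence of the x_i given y, which the notes do not state explicitly):

P(y | x1, x2, x3) = [ P(x1 | y) * P(x2 | y) * P(x3 | y) * P(y) ] / P(x1, x2, x3)

The denominator P(x1, x2, x3) is the constant that gets cancelled: it is identical for every class of y, so classes can be compared using the numerator alone.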
Example (Naive Bayes):
Input (Outlook, Temperature) -> Output (Play) for Binary Classification (Yes/No)
We compute the probabilities for each case of the inputs (Outlook, Temperature) for each value of the O/P (Play); a counting sketch follows below.
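A minimal Python sketch of this counting step, assuming a tiny hypothetical training table (the rows, value names, and helper functions below are illustrative only, not taken from the notes):

from collections import Counter, defaultdict

# Hypothetical rows of (Outlook, Temperature, Play)
rows = [
    ("Sunny",    "Hot",  "No"),
    ("Sunny",    "Mild", "Yes"),
    ("Overcast", "Hot",  "Yes"),
    ("Rain",     "Mild", "Yes"),
    ("Rain",     "Cool", "No"),
]

# Class counts give the prior P(Play)
class_counts = Counter(play for _, _, play in rows)

# Conditional counts give P(Outlook | Play) and P(Temperature | Play)
cond_counts = defaultdict(Counter)   # key: (feature name, class)
for outlook, temp, play in rows:
    cond_counts[("Outlook", play)][outlook] += 1
    cond_counts[("Temperature", play)][temp] += 1

def prior(play):
    # P(Play = play), estimated by relative frequency
    return class_counts[play] / len(rows)

def likelihood(feature, value, play):
    # P(feature = value | Play = play), estimated by relative frequency
    return cond_counts[(feature, play)][value] / class_counts[play]

def posterior_score(outlook, temp, play):
    # Unnormalised posterior: P(Outlook | Play) * P(Temperature | Play) * P(Play);
    # the evidence P(Outlook, Temperature) is dropped because it is the
    # same constant for every class, as noted above.
    return (likelihood("Outlook", outlook, play)
            * likelihood("Temperature", temp, play)
            * prior(play))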
To find the probability of Play given ( Outlook = "Sunny" & Temperature = "Hot" ):
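Applying the cancelled-denominator form above, we compare the two class scores (in the Python sketch this is posterior_score("Sunny", "Hot", "Yes") versus posterior_score("Sunny", "Hot", "No")):

P(Yes | Sunny, Hot) ∝ P(Sunny | Yes) * P(Hot | Yes) * P(Yes)
P(No | Sunny, Hot) ∝ P(Sunny | No) * P(Hot | No) * P(No)

Whichever class gives the larger product is the predicted value of Play.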
Conclusion: