Bayes' rule
We use Bayes' rule when we have a hypothesis and have observed some evidence, and we want the probability of the hypothesis given that evidence. It is a method for calculating the posterior probability from the prior probability and the likelihood. Intuitively, we obtain the probability of the hypothesis given the evidence by restricting all possibilities to the ones consistent with the evidence.
It is a formula that expresses statistical thinking, derived from the definitions of conditional probability, joint probability, and marginalization.
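As a quick sketch of that derivation, writing H for the hypothesis and E for the evidence (symbols chosen here for illustration):

$$
P(H \mid E) = \frac{P(H, E)}{P(E)} = \frac{P(E \mid H)\,P(H)}{P(E)}, \qquad P(E) = \sum_{H'} P(E \mid H')\,P(H')
$$

The first equality is the definition of conditional probability, the second rewrites the joint probability, and the denominator comes from marginalizing over all hypotheses.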
The car key example illustrates Bayesian probability well, because it shows how the probability distribution over a model's parameters changes before and after an observation. Bayes' rule is the formula for calculating this updated degree of belief: given a prior probability and new evidence, it uses conditional probability to compute the posterior probability.
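A minimal sketch of that kind of update in code, assuming made-up rooms, a made-up prior, and a made-up chance that a search misses the keys (none of these values come from the original example):

```python
# Hypothetical "where are the car keys?" update.
# Prior belief over rooms (assumed values, for illustration only).
prior = {"hallway": 0.5, "kitchen": 0.3, "bedroom": 0.2}

# Likelihood of the observation "searched the hallway, keys not found"
# under each hypothesis; assume a search finds keys in the searched room
# with probability 0.9, so P(not found | keys in hallway) = 0.1.
likelihood = {"hallway": 0.1, "kitchen": 1.0, "bedroom": 1.0}

# Bayes' rule: posterior is proportional to likelihood * prior,
# normalized by the evidence P(E).
evidence = sum(likelihood[room] * prior[room] for room in prior)
posterior = {room: likelihood[room] * prior[room] / evidence for room in prior}

print(posterior)  # belief shifts away from the hallway after the failed search
```

The prior distribution before the search and the posterior distribution after it are the "before and after observation" beliefs described above.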
The hyperparameters alpha and beta are used to determine the prior probability distribution over the model parameter theta, but they do not affect the relationship between the data D and theta, so the likelihood does not depend on them.
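In symbols (a generic formulation; the Beta prior below is just one common choice of how alpha and beta enter, not something stated in this note):

$$
p(\theta \mid D, \alpha, \beta) = \frac{p(D \mid \theta)\, p(\theta \mid \alpha, \beta)}{p(D \mid \alpha, \beta)}, \qquad p(D \mid \theta, \alpha, \beta) = p(D \mid \theta)
$$

For example, with a Bernoulli likelihood and a $\mathrm{Beta}(\alpha, \beta)$ prior, observing $k$ successes in $n$ trials gives the posterior $\mathrm{Beta}(\alpha + k, \beta + n - k)$: alpha and beta enter only through the prior, never through the likelihood.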
Log form
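Taking logarithms turns the product in Bayes' rule into a sum (written in the theta, D notation used above):

$$
\log p(\theta \mid D) = \log p(D \mid \theta) + \log p(\theta) - \log p(D)
$$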
where theta is the model parameter (the hypothesis), D is the observed data (the evidence), and p(D) is the evidence term obtained by marginalizing over theta.
Bayes' Theorem Notation
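The standard way to write the theorem, with each term labeled by its conventional name:

$$
\underbrace{P(H \mid E)}_{\text{posterior}} \;=\; \frac{\overbrace{P(E \mid H)}^{\text{likelihood}} \;\; \overbrace{P(H)}^{\text{prior}}}{\underbrace{P(E)}_{\text{evidence}}}
$$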