Law of total probability (sum rule of probability): marginalize a joint distribution by summing over the variables we are not interested in,

$$p(A, B) = \sum_{C, D} p(A, B, C, D)$$

Marginal probability from conditional probability means considering all cases, by integration in the continuous case:
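The sum rule can be sketched numerically; here is a minimal numpy example with a hypothetical joint distribution over four binary variables (the array `joint` and its shape are assumptions for illustration):

```python
import numpy as np

# Hypothetical joint distribution p(A, B, C, D) over four binary variables,
# stored as a 4-D array whose entries sum to 1.
rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2, 2))
joint /= joint.sum()

# Sum rule: marginalize out C and D (axes 2 and 3) to obtain p(A, B).
p_AB = joint.sum(axis=(2, 3))

# The marginal is itself a valid distribution over (A, B).
assert np.isclose(p_AB.sum(), 1.0)
```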
$$P(A) = \int P(A, B) \, dB = \int P(A \mid B) \, P(B) \, dB$$

and in the discrete case

$$P(A) = \sum_i P(A, B_i) = \sum_i P(A \mid B_i) \, P(B_i)$$

Product rule, which underlies Bayes' theorem:

$$P(X, Y) = P(X) \, P(Y \mid X) = P(Y) \, P(X \mid Y) = P(X \cap Y)$$

The joint probability of X and Y is simply understood as choosing X first and then choosing Y given X. The confusing part is that, regardless of order, the reverse factorization takes the same form, and equating the two factorizations is exactly what gives Bayes' theorem: $P(Y \mid X) = P(X \mid Y) P(Y) / P(X)$.
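The product rule, the discrete sum rule, and Bayes' theorem fit together in a few lines of numpy; the prior `p_X` and conditional `p_Y_given_X` below are made-up numbers for illustration:

```python
import numpy as np

# Hypothetical binary variables: prior p(X) and conditional p(Y | X).
p_X = np.array([0.3, 0.7])            # p(X=0), p(X=1)
p_Y_given_X = np.array([[0.9, 0.1],   # p(Y | X=0)
                        [0.2, 0.8]])  # p(Y | X=1)

# Product rule: p(X, Y) = p(X) p(Y | X).
p_XY = p_X[:, None] * p_Y_given_X

# Sum rule: p(Y) = sum_i p(Y | X_i) p(X_i).
p_Y = p_XY.sum(axis=0)

# Bayes' theorem: p(X | Y) = p(X, Y) / p(Y)  (reverse conditional).
p_X_given_Y = p_XY / p_Y

# Each column of p(X | Y) is a valid distribution over X.
assert np.allclose(p_X_given_Y.sum(axis=0), 1.0)
```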
With these rules in hand, we can decide whether two variables are pairwise independent or not: X and Y are independent exactly when conditioning changes nothing,

$$P(X) = P(X \mid Y), \qquad P(Y) = P(Y \mid X)$$

or equivalently, by the product rule, $P(X, Y) = P(X) \, P(Y)$.
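The factorized form of the independence condition is easy to test on a joint table; `is_pairwise_independent` and both example joints below are hypothetical names and numbers chosen for illustration:

```python
import numpy as np

def is_pairwise_independent(joint):
    """Check P(X, Y) == P(X) P(Y), equivalent to P(X) == P(X | Y)."""
    p_x = joint.sum(axis=1)  # marginal over rows
    p_y = joint.sum(axis=0)  # marginal over columns
    return np.allclose(joint, np.outer(p_x, p_y))

# Independent example: the joint is the outer product of its marginals.
joint_indep = np.outer(np.array([0.4, 0.6]), np.array([0.25, 0.75]))
assert is_pairwise_independent(joint_indep)

# Dependent example: all mass on the diagonal, so knowing Y pins down X.
joint_dep = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
assert not is_pairwise_independent(joint_dep)
```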