# Conditional probability explained visually

## Want to join the conversation?

• It is because when we do "balancing", we are converting the fractions to a common denominator. For example, if there are two possible outcomes, one happening 1/4 of the time and one happening 1/2 of the time, and I "balance" the branches so they both have 4 outcomes, I'm actually turning 1/2 into 2/4, which makes it easier to calculate. I hope that makes sense! :D
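The common-denominator idea described above can be sketched with Python's exact fractions (a minimal illustration of the "balancing" step; the variable names are my own, not from the video):

```python
from fractions import Fraction

# One outcome happens 1/4 of the time, the other 1/2 of the time.
a = Fraction(1, 4)
b = Fraction(1, 2)

# "Balancing" = rewriting both over the common denominator 4,
# so each branch splits into equally likely leaves.
denom = 4
print(a * denom, "leaf out of", denom)    # 1 leaf out of 4
print(b * denom, "leaves out of", denom)  # 2 leaves out of 4 (1/2 == 2/4)
```

Rewriting 1/2 as 2/4 does not change the probability; it only makes every leaf carry the same weight.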
• I see no flaw.

I think this video should have gone a step further in the building of the leaves. It is essential to understand that the leaves are not "probabilities", but rather counts of events.

When initially building a tree diagram, we use the given information: the scenario and the factors that may influence the probabilities. In the first scenario we are asked for P(Fair | Heads):
Fair coin, P(Tails) = 1/2, P(Heads) = 1/2,
Biased coin: P(T) = 0, P(Heads) = 1 = 1/1.
The LCM of the denominators of these probabilities is 2. So re-write:

Fair coin, P(Tails) = 1/2, P(Heads) = 1/2,
Biased coin: P(T) = 0, P(Heads) = 2/2.
The numerator becomes our count of leaves. So P(Fair | Heads) = Count of Fair Heads / Count of Heads = 1/(1 + 2) = 1/3.
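The leaf-counting above can be checked in a few lines of Python (a minimal sketch; the variable names are my own):

```python
from fractions import Fraction

# Leaf counts after rewriting every branch over denominator 2:
#   fair coin   -> 1 Heads leaf, 1 Tails leaf
#   biased coin -> 2 Heads leaves, 0 Tails leaves
fair_heads = 1
biased_heads = 2

# Condition on Heads: the Tails leaves are pruned, and we divide
# the fair-coin Heads leaves by all remaining Heads leaves.
p_fair_given_heads = Fraction(fair_heads, fair_heads + biased_heads)
print(p_fair_given_heads)  # 1/3
```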

Weighted Bias Scenario:
Two Fair Coins, One Biased.
Fair1 coin, P(Tails) = 1/2, P(Heads) = 1/2,
Fair2 coin, P(Tails) = 1/2, P(Heads) = 1/2,
Biased coin, P(Tails) = 1/3, P(Heads) = 2/3 (the biased coin's probabilities still sum to 1).

The LCM of 2 and 3 is 6. So re-write:

Fair1 coin, P(Tails) = 3/6, P(Heads) = 3/6,
Fair2 coin, P(Tails) = 3/6, P(Heads) = 3/6,
Biased coin, P(Tails) = 2/6, P(Heads) = 4/6

Numerator is your count of leaves:

P(Biased | Heads) = Count of Biased Heads / Count of All Heads = 4/(3 + 3 + 4) = 4/10. (Again, we disregarded the Tails branches and leaves.)
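The same leaf-counting works for the weighted scenario (a quick sketch; the dictionary keys are my own labels for the three coins):

```python
from fractions import Fraction

# Heads-leaf counts after rewriting every branch over denominator 6.
# Tails leaves are pruned, since we condition on Heads.
heads_leaves = {"fair1": 3, "fair2": 3, "biased": 4}

p_biased_given_heads = Fraction(heads_leaves["biased"],
                                sum(heads_leaves.values()))
print(p_biased_given_heads)  # 2/5, i.e. 4/10
```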

This principle can be applied to all sorts of scenarios:
Three coins, all biased
Coin1: P(tails) = 1/6, P(H) = 5/6
Coin2: P(tails) = 2/5, P(H) = 3/5
Coin3: P(tails) = 2/3, P(H) = 1/3

What's the LCM of 6, 5, and 3? 30. It's the smallest number that is divisible by 3, 5, and 6.
• This video was very helpful. Is there another video that works through the same problems using Bayes' theorem?
• Instead of doing what is done in the video (making more branches by taking the LCM):

Add all favorable outcomes: 2/3 H + H + H = 8/3 H.

Then I find the biased outcome, which is 2/3 H.

Then I divide the biased outcome by the total favorable outcomes: 2/3 divided by 8/3, which gives a probability of 1/4, or 25%.

Can someone please tell me what is flawed with this process of solving the question?

Thanks!
• If you think about what he's doing, he's adjusting by the LCM to keep the proportions easy to divide up later, once you cut the branches related to Tails.

You don't have to do this, but you've slipped up in the calculation: you're right that it's 2/3 on top, but once you cut the Tails (1/2 and 1/2 from the fair coins; the biased coin's 1/3 was already excluded) you're left with 1/2 + 1/2 + 2/3 = 5/3. If you divide 2/3 by 5/3 you get the same answer as the method in the video. However, it's clearly easier to make a mistake doing it this way!
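Both routes can be checked with exact fractions (a quick sketch, assuming the two-fair-one-biased setup discussed above; variable names are my own):

```python
from fractions import Fraction
from math import lcm

# Per-coin probability of Heads: two fair coins and one biased coin.
p_heads = [Fraction(1, 2), Fraction(1, 2), Fraction(2, 3)]

# Method 1: divide the biased coin's Heads probability by the
# total Heads probability across the (equally likely) coins.
direct = p_heads[2] / sum(p_heads)          # (2/3) / (5/3)

# Method 2: rescale by the LCM of the denominators and count leaves.
d = lcm(*(p.denominator for p in p_heads))  # LCM of 2, 2, 3 -> 6
counts = [p * d for p in p_heads]           # 3, 3, 4 Heads leaves
by_counts = Fraction(counts[2], sum(counts))

print(direct, by_counts)  # both 2/5, i.e. 4/10
```

Both methods agree; the LCM step only rewrites the same ratios as whole-number leaf counts.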
• At that point in the video, why does the biased coin lead to three equally likely possibilities?
• I think the video has potential, but I just can't get the same answer using the formula. It is imperative to explain how the formula is relevant here; otherwise the video is more confusing than helpful, which is a shame, as most of the videos are really good. Can somebody please post the answer with the formula?
• In this example, I'm thinking this way:
Please guide me: why is this wrong?
• Because each of the outcomes in the unbiased branches is worth 1/2, while the outcomes in the biased branch are worth 1/3 each. (This is why the video talks about "balancing" them out, which makes all the outcomes worth 1/6 each.)

It might become more obvious what is happening if you try to add the probabilities of all the outcomes up (which should total to 1 or 100%).
1/3 (fair) * (1/2 (H) + 1/2 (T)) +
1/3 (fair) * (1/2 (H) + 1/2 (T)) +
1/3 (biased) * (1/3 (H) + 1/3 (H) + 1/3 (T)) = 1

So the chances of getting heads is:
P(H) =
1/3 (fair) * (1/2 (H)) +
1/3 (fair) * (1/2 (H)) +
1/3 (biased) * (1/3 (H) + 1/3 (H))
= 1/6 + 1/6 + 2/9
= 3/18 + 3/18 + 4/18
= 10/18

The chances of getting the biased coin and heads is:
P(H&B) = 1/3 (biased) * (1/3 (H) + 1/3 (H))
= 2/9
= 4/18

So, the probability that we got the biased coin given that we got heads is:
P(B|H) = P(H&B) / P(H)
P(B|H) = (4/18) / (10/18) = 4/10

Hope this makes sense
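The Bayes-style calculation above can be replayed with exact fractions (a minimal sketch of the same arithmetic; variable names are my own):

```python
from fractions import Fraction

# Pick the biased coin (1/3), then get Heads (2/3): P(H & B) = 2/9.
p_h_and_b = Fraction(1, 3) * Fraction(2, 3)

# Total P(H): two fair coins contribute 1/3 * 1/2 each, plus P(H & B).
p_h = Fraction(1, 3) * Fraction(1, 2) * 2 + p_h_and_b  # 1/6 + 1/6 + 2/9

p_b_given_h = p_h_and_b / p_h
print(p_b_given_h)  # 2/5, i.e. 4/10
```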
• What is the relation with conditional probability? It is all about dependency: getting HH (the second event) depends on whatever you picked (the first event), and it is also related to the idea of equally likely or not equally likely outcomes. What is the point of conditional probability here?
• Instead of doing what is done in the video (making more branches by taking the LCM):

all favorable outcomes: 2 H from the biased coin
total number of outcomes: 4 H

then by this method,
P(B|H) = 2/4 = 50%

Can someone explain why that is wrong?
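One way to see why counting 2 H out of 4 H fails: the four Heads leaves are not equally likely, so each must be weighted by its branch probability before conditioning. A sketch, assuming the two-fair-one-biased setup from earlier in the thread (variable names are my own):

```python
from fractions import Fraction

third = Fraction(1, 3)  # each coin is picked with probability 1/3

# Probability carried by each Heads leaf:
leaves = [third * Fraction(1, 2),  # fair1 Heads
          third * Fraction(1, 2),  # fair2 Heads
          third * Fraction(1, 3),  # biased Heads (first of two H leaves)
          third * Fraction(1, 3)]  # biased Heads (second of two H leaves)

naive = Fraction(2, 4)                            # treats all leaves as equal
weighted = (leaves[2] + leaves[3]) / sum(leaves)  # correct conditioning
print(naive, weighted)  # 1/2 vs 2/5
```

The fair leaves carry 1/6 each while the biased leaves carry only 1/9 each, so a plain head-count over-weights the biased coin.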