High school statistics
- Compound probability of independent events
- Independent events example: test taking
- General multiplication rule example: independent events
- Dependent probability introduction
- General multiplication rule example: dependent events
- Probability with general multiplication rule
- Interpreting general multiplication rule
- Interpret probabilities of compound events
Compound probability of independent events
You'll become familiar with the concept of independent events, where the outcome of one event in no way affects what happens in a second event. Keep in mind, too, that the sum of the probabilities of all the possible events should equal 1. Created by Sal Khan.
- At 2:44: "But these are independent events. What happens in the first flip in no way affects what happens in the second flip..." And at 2:55: "...where someone thinks if I got a bunch of heads in a row, then all of a sudden it becomes likely on the next flip to get a tail. This is not the case."
Since H and T are equally likely events, if I get more heads in the beginning, don't I then have a better chance of getting a tail (as the number of flips approaches infinity, the split must approach 50%-50%)? Can you please explain? (108 votes)
- The idea behind the law of large numbers is that with big enough numbers, no small divergence from the theoretical probability will make a difference. Let's say you flip a coin, and the first 10 times it comes up heads. If you flip the coin another 100 times, then you would expect 50 heads and 50 tails. That means that over the 110 flips (including the first 10) you would have 60 heads and 50 tails, or about a 55/45 split. But let's say you continue flipping another 1000 times. You would expect 500 heads and 500 tails. Then we would have 1110 flips, and of these, 560 (500+50+10) would be heads and 550 would be tails. This is about a 50.5/49.5 split. Notice how the gap got smaller when we added more flips. If we kept adding more and more flips, a million or a billion, then we would get a lot closer to the 50/50 we would expect. The coin never needs to "catch up" by flipping more tails than heads; it just keeps flipping fairly, and eventually those 10 heads you flipped at the start become insignificant. (34 votes)
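The "streaks wash out" argument above can be illustrated with a quick simulation. This is only a sketch; the `heads_proportion` helper, its parameters, and the seed are invented for illustration:

```python
import random

def heads_proportion(extra_flips, initial_heads=10, seed=0):
    """Start from a streak of 10 heads already flipped, then keep
    flipping fairly and return the overall proportion of heads."""
    rng = random.Random(seed)
    heads = initial_heads
    total = initial_heads
    for _ in range(extra_flips):
        heads += rng.randint(0, 1)  # 1 counts as heads, 0 as tails
        total += 1
    return heads / total
```

With no extra flips the proportion is 1.0 (all 10 flips were heads), but after a million fair flips the early streak contributes almost nothing and the proportion sits very close to 0.5, without the coin ever "compensating" with extra tails.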
- Let me get this straight.
So if I get 5 H (heads) in a row flipping a coin, it's not more likely that my sixth will be a T (tails).
I understand that.
But isn't it also true that the probability of six coins flips resulting in at least one T is very high? I mean, it's unlikely that you'll get 6 H in a row, with no Ts. The probability of the sequence HHHHHH is low.
So I'm thinking that although it is very likely that you will get at least one T in your sequence of 6 flips, it is not necessarily likely that the next flip will have a T.
But say for this purpose that you are playing a ridiculous game where you can win a million dollars for flipping a fair coin 5 times and getting one T. If you don't, you lose a million dollars.
You have flipped 4 times and gotten all Hs. You know that the probability of getting at least one T is very high. Because you haven't gotten any Ts yet, doesn't that mean that you have a very good chance of getting a T on your next flip?
Or is it that the chance of getting a T on your last flip is equal to the chance of getting a T any of your other times?
I'm confused.(28 votes)
- Here's a more detailed answer:
The probability of flipping six heads in a row is very low before you start. This is a single independent event, not six different ones. However, once you have already flipped five heads, those five coins that you just flipped no longer matter. Thus, the probability of flipping a sixth head becomes the same as the probability of flipping a head for a single coin, since that's exactly what your sixth coin is: a single coin, unaffected by the previous five.
So as a summary:
We are comparing two different events: flipping six heads in a row (judged before any flips happen) and flipping one more head (judged after five heads have already occurred). (8 votes)
- Okay, here you multiply the events, but when do you add them? What is the distinction between adding and multiplying? Or in other words: When do I use what?(24 votes)
- You can add probabilities of events if they are mutually exclusive, meaning they cannot both happen. For instance, if you roll a die once, you can't get both a three and a four; you get just one of them, or neither.
On the other hand, if you roll it twice, the two rolls are independent of each other. So there you've got to multiply the probabilities. (15 votes)
- Hey... I have a very weird question.
What if there was a family who decided to have only 5 children (but they don't have twins or anything) and give different names for boys and girls according to what... uh, 'rank' of child it is? ex. 1st child, 2nd child, etc. All boy names start with a B, and girl names start with a G. The total outcomes are 32. I found that after 2 days of writing them down. But how do you solve this mathematically? I tried 5 to the 2nd power at first, but as that is 25, I then tried 2 to the 5th power, which is 32, but I still couldn't find an explanation. (11 votes)
- It is 2^5 - you can think of it as listing the number of choices for each "spot"
1st child - two choices
2nd child - two choices
3rd child - two choices
4th child - two choices
5th child - two choices
Then you multiply all of the different options for each spot and you get 2*2*2*2*2 = 2^5 = 32 (11 votes)
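The choices-per-spot argument can be checked by brute force: listing every boy/girl sequence for the five "spots" and counting them. A minimal sketch using the standard library:

```python
from itertools import product

# Each of the 5 children is independently a boy (B) or a girl (G),
# so the number of birth-order sequences is 2 * 2 * 2 * 2 * 2 = 2**5.
sequences = list(product("BG", repeat=5))

print(len(sequences))  # 32, matching the hand-written enumeration
```

This is the same multiplication-of-choices rule the answer describes, just carried out by the computer instead of over two days of writing.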
- What is the difference between mutually exclusive events and independent events?
- Interesting question! Many students confuse these two concepts.
Events A and B are called mutually exclusive if they cannot both occur, that is, P(A and B) = 0. In this situation, P(A or B) = P(A) + P(B).
Events A and B are called independent if the occurrence of one event has no effect on the probability of the other event occurring. In this situation, P(A and B) = P(A)*P(B).
Example: suppose two dice are rolled. Let A represent the event that the first die is a 1, let B represent the event that the first die is a 6, and let C represent the event that the second die is a 6.
A and B are mutually exclusive because the first die cannot be both a 1 and a 6. Note that A and B are not independent, because knowing that the first die is a 1 would eliminate the possibility that the first die is a 6 (that is, knowing that the first die is a 1 changes the probability that the first die is a 6, from 1/6 to 0).
A and C are independent, because knowing that the first die is a 1 has no effect at all on the probability that the second die is a 6. Note that A and C are not mutually exclusive, because it is possible for the first die to be a 1 and the second die to be a 6 (the probability that these both occur is 1/36, which is not 0).
Have a blessed, wonderful day!(14 votes)
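The dice example above can be verified by enumerating all 36 equally likely outcomes. This sketch (the `prob` helper and event names are my own, mirroring the A, B, C of the answer) uses exact fractions:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Exact probability of an event over the 36 outcomes."""
    hits = sum(1 for o in outcomes if event(o))
    return Fraction(hits, len(outcomes))

A = lambda o: o[0] == 1  # first die is a 1
B = lambda o: o[0] == 6  # first die is a 6
C = lambda o: o[1] == 6  # second die is a 6

# A and B are mutually exclusive: P(A and B) = 0.
print(prob(lambda o: A(o) and B(o)))  # 0
# A and C are independent: P(A and C) = P(A) * P(C) = 1/36, not 0.
print(prob(lambda o: A(o) and C(o)))  # 1/36
```

Note how the two properties come apart exactly as described: mutual exclusivity makes the joint probability 0, while independence makes it the product of the individual probabilities.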
- Can there be a probability like this: 26/24? (2 votes)
- Probabilities always lie in the range between 0 and 1; 26/24 is greater than 1, so it cannot represent a probability. (12 votes)
- Why did Sal multiply the probabilities of getting heads on each of the coins in order to find the probability of getting heads on both coins simultaneously? (5 votes)
- He listed all of them because he wanted to find all of the possible outcomes, not just some of them. (3 votes)
- If you flip a coin 17 times, what is the probability that the number of heads flipped will be the same as the number of tails flipped?(2 votes)
- The probability is 0: with 17 flips (an odd number), the number of heads can never equal the number of tails. Unless, of course, the 17th flip lands on the little space between heads and tails.
Which is nearly IMPOSSIBLE. (1 vote)
- I still feel a little unsure on this, the maths makes sense but I am not sure on the understanding in case I need to apply it to a different question? Any explanations?(3 votes)
- Regarding the gambler's fallacy, I have a hard time understanding why it doesn't work.
Let's say you flip an unbiased coin 5 times, and all 5 times it was tails.
The probability tells you that since each flip is an independent event, the next time you flip the coin it will still be 50% that you get heads and 50% that you get tails. If, however, you consider it as a compound event, there's a 1/(2^6) chance, about 1.5%, that you will get 6 heads or 6 tails in a row.
I wrote a small program to simulate coin flipping and count how many times each face repeats in a row. Here's an example of the run-length counts when flipping a coin 50,000 times:
1: 12590 ; 2 : 6310 ; 3 : 3112 ; 4 : 1573 ; 5 : 796 ; 6 : 357 ; 7 : 183 ;
8 : 93 ; 9 : 49 ; 10 : 20 ; 11 : 12 ; 12 : 10 ; 13 : 4 ; 14 : 1 ; 15 : 2
The above numbers clearly follow the 50,000 * 1/(2^(n+1)) pattern, where n is the run length.
Generalizing, regardless of sample size, 1/(2^(n+1)) is the likelihood of a particular face repeating exactly n times in a row.
So, if the gambler's fallacy really is a fallacy, how should this pattern of repeated outcomes in a sequence be explained? (4 votes)
- Let F be the event that the first five flips are heads, and S be the event that the first six flips are heads. As you note, P(F) = 1/2^5 ~ 3% and P(S) = 1/2^6 ~ 1.5%. It is more improbable to get six heads in a row than to get five heads in a row.
However, the gambler's fallacy assumes that this holds true in all situations. If you have already flipped a coin five times, and seen five heads, then there is a 50% chance that the next flip will be heads (thus giving you six heads in a row). Using notation, P(S | F) = 1/2.
I think your program is flipping a coin until the first tail, and then reporting the number of heads seen prior to the first tail. Thus for a number to be reported under "5", we must have five consecutive heads followed by one tail. For a number to be reported under "6", we must have six consecutive heads followed by one tail. Thus the pattern follows 1/2^(n+1) rather than 1/2^n.(1 vote)
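The original poster's program isn't shown, so here is a hedged sketch of what such a run-length counter might look like; the `run_lengths` helper and the seed are my own, not taken from the thread:

```python
import random
from collections import Counter

def run_lengths(n_flips, seed=1):
    """Flip a fair coin n_flips times and tally how often each
    run length (the same face repeated in a row) occurs."""
    rng = random.Random(seed)
    flips = [rng.randint(0, 1) for _ in range(n_flips)]
    counts = Counter()
    run = 1
    for prev, cur in zip(flips, flips[1:]):
        if cur == prev:
            run += 1          # streak continues
        else:
            counts[run] += 1  # streak broken; record its length
            run = 1
    counts[run] += 1          # record the final streak
    return counts

counts = run_lengths(50_000)
```

A run of length exactly n requires n identical flips bounded by a different face on each side, which is why runs of length n show up roughly 50,000 / 2^(n+1) times, matching the poster's table, with no dependence between flips anywhere in the code.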
Let's think about the situation where we have a completely fair coin here. So let me draw it. I'll assume it's a quarter or something. Let's see. So this is a quarter. Let me draw my best attempt at a profile of George Washington. Well, that'll do. So it's a fair coin. And we're going to flip it a bunch of times and figure out the different probabilities. So let's start with a straightforward one. Let's just flip it once. So with one flip of the coin, what's the probability of getting heads? Well, there's two equally likely possibilities. And the one with heads is one of those two equally likely possibilities, so there's a 1/2 chance. Same thing if we were to ask what is the probability of getting tails? There are two equally likely possibilities, and one of those gives us tails, so 1/2. And this is one thing to realize. If you take the probability of heads plus the probability of tails, you get 1/2 plus 1/2, which is 1. And this is generally true. The sum of the probabilities of all of the possible events should be equal to 1. And that makes sense, because you're adding up all of these fractions, and the numerator will then add up to all of the possible events. The denominator is always all the possible events. So you have all the possible events over all the possible events when you add all of these things up. Now let's take it up a notch. Let's figure out the probability of-- I'm going to take this coin, and I'm going to flip it twice-- the probability of getting heads and then getting another heads. There's two ways to think about it. One way is to just think about all of the different possibilities. I could get a head on the first flip and a head on the second flip, head on the first flip, tail on the second flip. I could get tails on the first flip, heads on the second flip. Or I could get tails on both flips. So there's four distinct, equally likely possibilities. And one way to think about it is that on the first flip, I have two possibilities.
On the second flip, I have another two possibilities. I could have heads or tails, heads or tails. And so I have four possibilities. For each of these possibilities, for each of these two, I have two possibilities here. So either way, I have four equally likely possibilities. And how many of those meet our constraints? Well, we have it right over here, this one right over here-- having two heads meets our constraints. And there's only one of those possibilities. I've only circled one of the four scenarios. So there's a 1/4 chance of that happening. Another way you could think about this-- and this is because these are independent events. And this is a very important idea to understand in probability, and we'll also study scenarios that are not independent. But these are independent events. What happens in the first flip in no way affects what happens in the second flip. And this is actually one thing that many people don't realize. There's something called the gambler's fallacy, where someone thinks if I got a bunch of heads in a row, then all of a sudden, it becomes more likely on the next flip to get a tails. That is not the case. Every flip is an independent event. What happened in the past in these flips does not affect the probabilities going forward. The fact that you got a heads on the first flip in no way affects whether you get a heads on the second flip. So if you can make that assumption, you could say that the probability of getting heads and heads, or heads and then heads, is going to be the same thing as the probability of getting heads on the first flip times the probability of getting heads on the second flip. And we know the probability of getting heads on the first flip is 1/2 and the probability of getting heads on the second flip is 1/2. And so we have 1/2 times 1/2, which is equal to 1/4, which is exactly what we got when we tried out all of the different scenarios, all of the equally likely possibilities. Let's take it up another notch.
Let's figure out the probability-- and we've kind of been ignoring tails, so let's pay some attention to tails. The probability of getting tails and then heads and then tails-- so this exact series of events. So I'm not saying in any order two tails and a head. I'm saying this exact order-- the first flip is a tails, second flip is a heads, and then third flip is a tail. So once again, these are all independent events. The fact that I got tails on the first flip in no way affects the probability of getting a heads on the second flip. And that in no way affects the probability of getting a tails on the third flip. So because these are independent events, we could say that's the same thing as the probability of getting tails on the first flip times the probability of getting heads on the second flip times the probability of getting tails on the third flip. And we know these are all independent events, so this right over here is 1/2 times 1/2 times 1/2. 1/2 times 1/2 is 1/4. 1/4 times 1/2 is equal to 1/8, so this is equal to 1/8. And we can verify it. Let's try out all of the different scenarios again. So you could get heads, heads, heads. You could get heads, heads, tails. You could get heads, tails, heads. You could get heads, tails, tails. You can get tails, heads, heads. This is a little tricky sometimes. You want to make sure you're being exhaustive in all of the different possibilities here. You could get tails, heads, tails. You could get tails, tails, heads. Or you could get tails, tails, tails. And what we see here is that we got exactly eight equally likely possibilities. We have eight equally likely possibilities. And the tail, heads, tails is exactly one of them. It is this possibility right over here. So it is 1 of 8 equally likely possibilities.
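Both checks in the transcript, listing every equally likely sequence and multiplying the per-flip probabilities, can be carried out in a few lines. This is a sketch; the `prob_of_sequence` helper is invented for illustration:

```python
from fractions import Fraction
from itertools import product

def prob_of_sequence(target):
    """Probability of one exact flip sequence, found by enumerating
    all equally likely outcomes of len(target) fair-coin flips."""
    outcomes = list(product("HT", repeat=len(target)))
    hits = sum(1 for o in outcomes if "".join(o) == target)
    return Fraction(hits, len(outcomes))

# Heads then heads: 1 of the 4 equally likely two-flip outcomes.
print(prob_of_sequence("HH"))   # 1/4, matching 1/2 * 1/2
# Tails, heads, tails: 1 of the 8 equally likely three-flip outcomes.
print(prob_of_sequence("THT"))  # 1/8, matching 1/2 * 1/2 * 1/2
```

The enumeration and the multiplication rule agree precisely because the flips are independent: each exact sequence is one outcome among 2^n equally likely ones.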