Haleshot committed (unverified) · Commit 25e577e · 1 parent: 9ae2eda

refine binomial distribution explanations for clarity

probability/14_binomial_distribution.py CHANGED
@@ -13,7 +13,7 @@
 
  import marimo
 
- __generated_with = "0.11.24"
+ __generated_with = "0.12.6"
  app = marimo.App(width="medium", app_title="Binomial Distribution")
 
 
@@ -25,11 +25,9 @@ def _(mo):
 
  _This notebook is a computational companion to ["Probability for Computer Scientists"](https://chrispiech.github.io/probabilityForComputerScientists/en/part2/binomial/), by Stanford professor Chris Piech._
 
- In this section, we will discuss the binomial distribution. To start, imagine the following example:
+ The binomial distribution is essentially what happens when you run multiple Bernoulli trials and count the successes. I love this distribution because it appears everywhere in practical scenarios.
 
- Consider $n$ independent trials of an experiment where each trial is a "success" with probability $p$. Let $X$ be the number of successes in $n$ trials.
-
- This situation is truly common in the natural world, and as such, there has been a lot of research into such phenomena. Random variables like $X$ are called **binomial random variables**. If you can identify that a process fits this description, you can inherit many already proved properties such as the PMF formula, expectation, and variance!
+ Think about it: whenever you're counting how many times something happens across multiple attempts, you're likely dealing with a binomial. Website conversions, A/B testing results, even counting heads in multiple coin flips — all binomial!
  """
  )
  return
@@ -197,11 +195,11 @@ def _(mo):
  r"""
  ## Relationship to Bernoulli Random Variables
 
- One way to think of the binomial is as the sum of $n$ Bernoulli variables. Say that $Y_i$ is an indicator Bernoulli random variable which is 1 if experiment $i$ is a success. Then if $X$ is the total number of successes in $n$ experiments, $X \sim \text{Bin}(n, p)$:
+ One way I like to think about the binomial: it's just adding up a bunch of Bernoullis. If each $Y_i$ is a Bernoulli that tells us if the $i$-th trial succeeded, then:
 
  $$X = \sum_{i=1}^n Y_i$$
 
- Recall that the outcome of $Y_i$ will be 1 or 0, so one way to think of $X$ is as the sum of those 1s and 0s.
+ This makes the distribution really intuitive to me - we're just counting 1s across our $n$ experiments.
  """
  )
  return
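As a companion to the rewritten intro in the second hunk: once you recognize a count-of-successes process as binomial, the PMF, expectation, and variance come for free. The snippet below is an illustrative sketch, not code from the commit; it assumes `scipy` is available, and the values of `n` and `p` are arbitrary.

```python
from scipy import stats

# Hypothetical example values, chosen only for illustration
n, p = 10, 0.3

x = stats.binom(n, p)          # X ~ Bin(n, p)

# PMF: P(X = k) for every possible count of successes k
for k in range(n + 1):
    print(f"P(X = {k}) = {x.pmf(k):.4f}")

# Expectation and variance follow directly: E[X] = n*p, Var(X) = n*p*(1-p)
print("E[X]  =", x.mean(), "  (n*p       =", n * p, ")")
print("Var X =", x.var(),  "  (n*p*(1-p) =", n * p * (1 - p), ")")
```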
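The last hunk rewords the sum-of-Bernoullis view, $X = \sum_{i=1}^n Y_i$. A small simulation makes that identity concrete: draw $n$ Bernoulli indicators, add them up, and the resulting counts behave like draws from $\text{Bin}(n, p)$. Again a sketch under assumed tooling (`numpy`); none of these names come from the notebook.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, num_sims = 10, 0.3, 100_000   # illustrative values only

# Each row holds n Bernoulli(p) indicators Y_1, ..., Y_n (1 = success, 0 = failure)
y = (rng.random((num_sims, n)) < p).astype(int)

# X is just the sum of the indicators in each row
x = y.sum(axis=1)

# The empirical frequencies of X should line up with the Bin(n, p) PMF
empirical = np.bincount(x, minlength=n + 1) / num_sims
print("empirical P(X = k):", np.round(empirical, 4))
print("empirical mean:", x.mean(), " vs  n*p =", n * p)
```

Summing a boolean matrix row by row keeps the individual $Y_i$ explicit instead of calling a binomial sampler directly, which is exactly the relationship the notebook text is describing.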