Commit d0d8103

Tom's July 16 edits of prob_matrix lecture (#487)
Co-authored-by: thomassargent30 <ts43@nyu.edu>
1 parent d52cdec commit d0d8103

1 file changed

lectures/prob_matrix.md

Lines changed: 25 additions & 11 deletions

@@ -35,7 +35,7 @@ Among concepts that we'll be studying include
 
 We'll use a matrix to represent a bivariate or multivariate probability distribution and a vector to represent a univariate probability distribution
 
-This {doc}`companion lecture <stats_examples>` describes some popular probability distributions and uses Python to sample from them.
+This {doc}`companion lecture <stats_examples>` describes some popular probability distributions and describes how to use Python to sample from them.
 
 
 In addition to what's in Anaconda, this lecture will need the following libraries:
@@ -430,10 +430,11 @@
 
 $$
 \textrm{Prob}\{X=i|Y=j\} =\frac{\textrm{Prob}\{X=i,Y=j\}}{\textrm{Prob}\{Y=j\}}=\frac{\textrm{Prob}\{Y=j|X=i\}\textrm{Prob}\{X=i\}}{\textrm{Prob}\{Y=j\}}
-$$
+$$ (eq:condprobbayes)
 
 ```{note}
-This can be interpreted as a version of what a Bayesian calls **Bayes' Law**.
+Formula {eq}`eq:condprobbayes` is also what a Bayesian calls **Bayes' Law**. A Bayesian statistician regards the marginal probability distribution $\textrm{Prob}(X=i), i = 1, \ldots, J$ as a **prior** distribution that describes his personal subjective beliefs about $X$.
+He then interprets formula {eq}`eq:condprobbayes` as a procedure for constructing a **posterior** distribution that describes how he would revise his subjective beliefs after observing that $Y$ equals $j$.
 ```
 
 
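As a quick illustration of formula `eq:condprobbayes`, here is a minimal sketch of the prior-to-posterior calculation, assuming a hypothetical 2×2 joint matrix `F` whose rows index $X$ and whose columns index $Y$:

```python
import numpy as np

# hypothetical joint distribution: rows index X, columns index Y
F = np.array([[0.3, 0.2],
              [0.1, 0.4]])

prior_x = F.sum(axis=1)       # marginal Prob(X=i): the Bayesian's prior
marginal_y = F.sum(axis=0)    # marginal Prob(Y=j)

j = 0                         # suppose we observe Y = j
posterior_x = F[:, j] / marginal_y[j]   # Prob(X=i | Y=j) by Bayes' Law
print(prior_x)       # [0.5 0.5]
print(posterior_x)   # [0.75 0.25]
```

Observing $Y = 0$ shifts beliefs toward the first value of $X$ because the first column of `F` puts most of its mass there.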
@@ -588,9 +589,15 @@ Marginal distributions are
 $$ \textrm{Prob}(X=i)=\sum_j{f_{ij}}=u_i $$
 $$ \textrm{Prob}(Y=j)=\sum_i{f_{ij}}=v_j $$
 
-Below we draw some samples confirm that the "sampling" distribution agrees well with the "population" distribution.
 
-**Sample results:**
+**Sampling:**
+
+Let's write some Python code that lets us draw some long samples and compute relative frequencies.
+
+The code will let us check whether the "sampling" distribution agrees with the "population" distribution - confirming that
+the population distribution correctly tells us the relative frequencies that we should expect in a large sample.
+
+
 
 ```{code-cell} ipython3
 # specify parameters
@@ -615,7 +622,9 @@ x[1, p < f_cum[0]] = ys[0]
 print(x)
 ```
 
-Here, we use exactly the inverse CDF technique to generate sample from the joint distribution $F$.
+```{note}
+To generate random draws from the joint distribution $F$, we use the inverse CDF technique described in {doc}`this companion lecture <stats_examples>`.
+```
 
 ```{code-cell} ipython3
 # marginal distribution
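A minimal sketch of the inverse CDF technique for a discrete joint distribution, assuming a hypothetical matrix `F`; it also performs the relative-frequency check described above, confirming that long-sample frequencies come close to the population probabilities:

```python
import numpy as np

# hypothetical joint distribution: rows index X, columns index Y
F = np.array([[0.3, 0.2],
              [0.1, 0.4]])

rng = np.random.default_rng(0)
cdf = np.cumsum(F.ravel())          # CDF over the flattened (i, j) cells
cdf[-1] = 1.0                       # guard against floating-point round-off

u = rng.uniform(size=1_000_000)     # uniform draws on [0, 1)
cells = np.searchsorted(cdf, u)     # inverse CDF: first cell with cdf >= u

# relative frequency of each (i, j) cell, reshaped back into a matrix
freq = np.bincount(cells, minlength=F.size).reshape(F.shape) / u.size
print(freq)                         # should be close to F in a long sample
```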
@@ -715,9 +724,10 @@ x=x_2 & \vdots & \frac{0.1}{0.5}=0.2 & \frac{0.4}{0.5}=0.8 \\
 \end{array}\right]
 $$
 
-These population objects closely resemble sample counterparts computed above.
+These population objects closely resemble the sample counterparts computed above.
 
-Let's wrap some of the functions we have used in a Python class for a general discrete bivariate joint distribution.
+Let's wrap some of the functions we have used in a Python class that will let us generate and sample from a
+discrete bivariate joint distribution.
 
 ```{code-cell} ipython3
 class discrete_bijoint:
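The conditional-probability table above can be checked numerically with a small sketch; the second row of the joint matrix is taken from the table ($0.1$ and $0.4$), while the first row is illustrative:

```python
import numpy as np

# joint matrix whose second row matches the table above; first row is illustrative
F = np.array([[0.3, 0.2],
              [0.1, 0.4]])

# conditional distribution of Y given X = x_i: divide each row by its row sum
cond_Y_given_X = F / F.sum(axis=1, keepdims=True)
print(cond_Y_given_X[1])   # [0.2 0.8], i.e. 0.1/0.5 and 0.4/0.5 as in the table
```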
@@ -951,7 +961,7 @@ ax.set_xticks([])
 plt.show()
 ```
 
-Next we can simulate from a built-in `numpy` function and calculate a **sample** marginal distribution from the sample mean and variance.
+Next we can use a built-in `numpy` function to draw random samples, then calculate a **sample** marginal distribution from the sample mean and variance.
 
 ```{code-cell} ipython3
 μ= np.array([0, 5])
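A minimal sketch of this step, keeping the lecture's mean vector but assuming an illustrative covariance matrix `Σ`; it draws from `numpy`'s built-in multivariate normal sampler and computes the sample marginal mean and variance of the first coordinate:

```python
import numpy as np

rng = np.random.default_rng(1)
μ = np.array([0., 5.])
Σ = np.array([[1.0, 0.5],     # illustrative covariance matrix
              [0.5, 2.0]])

# draw a long sample from the bivariate normal distribution
draws = rng.multivariate_normal(μ, Σ, size=100_000)

# sample marginal of the first coordinate
print(draws[:, 0].mean(), draws[:, 0].var())   # close to μ[0] = 0 and Σ[0, 0] = 1
```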
@@ -984,7 +994,7 @@ plt.show()
 
 **Conditional distribution**
 
-The population conditional distribution is
+For a bivariate normal population distribution, the conditional distributions are also normal:
 
 $$
 \begin{aligned} \\
@@ -993,6 +1003,10 @@ $$
 \end{aligned}
 $$
 
+```{note}
+Please see this {doc}`quantecon lecture <multivariate_normal>` for more details.
+```
+
 Let's approximate the joint density by discretizing and mapping the approximating joint density into a matrix.
 
 We can compute the discretized marginal density by just using matrix algebra and noting that
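For a numerical check of these conditional-distribution formulas, here is a sketch using the standard bivariate normal identities $\mu_{1|2} = \mu_1 + \frac{\sigma_{12}}{\sigma_{22}}(x_2 - \mu_2)$ and $\sigma^2_{1|2} = \sigma_{11} - \frac{\sigma_{12}^2}{\sigma_{22}}$, with illustrative parameter values:

```python
import numpy as np

# illustrative parameters of a bivariate normal distribution
μ = np.array([0., 5.])
Σ = np.array([[1.0, 0.5],
              [0.5, 2.0]])

x2 = 6.0   # suppose we observe the second coordinate

# standard conditional-moment formulas for the first coordinate given x2
cond_mean = μ[0] + Σ[0, 1] / Σ[1, 1] * (x2 - μ[1])
cond_var = Σ[0, 0] - Σ[0, 1]**2 / Σ[1, 1]
print(cond_mean, cond_var)   # 0.25 and 0.875
```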
@@ -1221,7 +1235,7 @@ But the joint distributions differ.
 Thus, multiple joint distributions $[f_{ij}]$ can have the same marginals.
 
 **Remark:**
-- Couplings are important in optimal transport problems and in Markov processes.
+- Couplings are important in optimal transport problems and in Markov processes. Please see this {doc}`lecture about optimal transport <opt_transport>`.
 
 ## Copula Functions
 
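The remark about couplings can be illustrated with two hypothetical joint matrices that share the same uniform marginals while distributing probability mass differently:

```python
import numpy as np

# two different couplings of the same pair of marginal distributions
F1 = np.array([[0.25, 0.25],
               [0.25, 0.25]])   # independent coupling
F2 = np.array([[0.5, 0.0],
               [0.0, 0.5]])     # perfectly correlated coupling

for F in (F1, F2):
    print(F.sum(axis=1), F.sum(axis=0))   # both print [0.5 0.5] and [0.5 0.5]
```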
