diff --git a/ipa-core/src/protocol/dp/README.md b/ipa-core/src/protocol/dp/README.md
index d34396bf5..c5d0cb81a 100644
--- a/ipa-core/src/protocol/dp/README.md
+++ b/ipa-core/src/protocol/dp/README.md
@@ -8,10 +8,9 @@ introduces Binomials for DP.
 considers their use for d-dimension queries (such as we will need for WALR).
 
-To achieve a desired $(\varepsilon, \delta)$-DP guarantee, we generate $num_bernoulli$ secret shared samples of a
+To achieve a desired $(\varepsilon, \delta)$-DP guarantee, we generate $num\\_bernoulli$ secret shared samples of a
 Bernoulli having probability $0.5$ using PRSS. Next we aggregate them to get a Binomial sample. The result of the 2018
-paper above is that for small epsilon (TODO, how small required?), we require the following number of samples
-$$ num_bernoulli \geq 8 \log(2/\delta) /\varepsilon^2$$
+paper above gives the required number of samples for small epsilon.
 This [spreadsheet](https://docs.google.com/spreadsheets/d/1sMgqkMw3-yNBp6f8ctyv4Hdfx9Ei7muj0ZhP9i1DHrw/edit#gid=0)
 looks at the calculation for different parameter choices and confirms that this approach does lead to a better final
@@ -22,4 +21,4 @@ Gaussians but the 2018 paper's analysis shows this).
 
 ## Binomial Noise for d-dimensional queries
 For WALR we will be adding noise to a d-dimensional query, to add binomial noise we have to look
-simulatenously at the sensitivity under three different norms, $\ell_1, \ell_2, \ell_\infty$. TODO.
\ No newline at end of file
+simultaneously at the sensitivity under three different norms, $\ell_1, \ell_2, \ell_\infty$. TODO.
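
The aggregation described above (summing secret-shared Bernoulli(0.5) bits into a Binomial noise sample) is sized by the bound spelled out in the removed README line, $num\_bernoulli \geq 8 \log(2/\delta) / \varepsilon^2$. Below is a minimal standalone Rust sketch, not the ipa-core API: `required_bernoulli_samples` is a hypothetical helper, the natural logarithm is an assumption, and the example parameters are illustrative rather than taken from the linked spreadsheet.

```rust
// Sketch only: evaluates the sample-count bound quoted in the original README text,
//     num_bernoulli >= 8 * log(2 / delta) / epsilon^2,
// assuming the natural logarithm. In the protocol itself, each sample is a
// secret-shared Bernoulli(0.5) bit generated via PRSS, and summing the shares
// yields shares of Binomial(num_bernoulli, 0.5) noise; that part is omitted here.

/// Number of Bernoulli(0.5) samples needed for an (epsilon, delta)-DP guarantee
/// according to the bound above, rounded up (hypothetical helper).
fn required_bernoulli_samples(epsilon: f64, delta: f64) -> u64 {
    (8.0 * (2.0 / delta).ln() / (epsilon * epsilon)).ceil() as u64
}

fn main() {
    // Illustrative (epsilon, delta) choices; not values from the spreadsheet.
    for (epsilon, delta) in [(0.5, 1e-6), (1.0, 1e-6), (5.0, 1e-8)] {
        let n = required_bernoulli_samples(epsilon, delta);
        println!("epsilon = {epsilon}, delta = {delta} -> num_bernoulli >= {n}");
    }
}
```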