Website #179

Merged: 231 commits, Oct 18, 2022

Commits
3263e4b
analytic
Nov 23, 2021
9b79de6
docs
Dec 18, 2021
c4ae117
docs
Dec 19, 2021
9f3e49b
docs
Dec 26, 2021
e750b1e
docs
Jan 16, 2022
99d4c04
inference
reubenharry Mar 15, 2022
6478c0a
executable docs
reubenharry Mar 18, 2022
97517a2
more comments, added BayesianModel
reubenharry Mar 21, 2022
4577135
Remove MaybeT and use ExceptT
reubenharry Mar 23, 2022
e9b7158
remove vscode and update gitignore
reubenharry Mar 23, 2022
6c1214a
delete additional changes: Failure
reubenharry Mar 23, 2022
1bbdb2d
delete broken files
reubenharry Mar 23, 2022
003aca4
update test for enumerator
reubenharry Mar 23, 2022
31d3dd1
update models
reubenharry Mar 27, 2022
a48d806
tips on inference
reubenharry Mar 27, 2022
cb100df
density
reubenharry Mar 27, 2022
530f4d9
analytic
reubenharry Mar 29, 2022
4d2a447
analytic
reubenharry Mar 29, 2022
a057100
more pipes
reubenharry Mar 30, 2022
344fb10
update docs
reubenharry Mar 31, 2022
91f968b
linear outliers as in Gen tutorial
reubenharry Apr 1, 2022
0909ea9
update docs
reubenharry Apr 7, 2022
4ef4ccd
cleanup SMC.hs
reubenharry Apr 7, 2022
aa18cfc
more cleanup, and explicit imports
reubenharry Apr 7, 2022
10e464e
remove Rejection
reubenharry Apr 7, 2022
42a8a40
cleanup
reubenharry Apr 8, 2022
753e72a
enumerate
reubenharry Apr 8, 2022
e87ab45
binning
reubenharry Apr 8, 2022
7787b2a
comment
reubenharry Apr 11, 2022
66ec33a
explore
reubenharry Apr 13, 2022
ae590fc
nested inference example
reubenharry Apr 18, 2022
105a63f
found bug in Gamma
reubenharry Apr 25, 2022
a8d9a17
debugging
reubenharry Apr 26, 2022
93b1775
port lazyppl monad into monad-bayes
reubenharry Apr 26, 2022
f9b537e
mh
reubenharry Apr 26, 2022
f63cd79
add conjugacy tests
reubenharry Apr 28, 2022
5ea0f16
working branch
reubenharry Apr 28, 2022
b446c47
tried symbolic
reubenharry Apr 29, 2022
00f087a
no symbolic
reubenharry May 2, 2022
dc065ea
Merge branch 'analytic' into models
reubenharry May 2, 2022
84633be
remove reflection
reubenharry May 2, 2022
8e899cd
Merge branch 'analytic' into models
reubenharry May 2, 2022
24f6343
midway: need to finish hmm and enumerator:
reubenharry May 2, 2022
5bd1837
update docs
reubenharry May 4, 2022
b08e9dd
DP mixture: remove
reubenharry May 9, 2022
5875b43
build docs manually
reubenharry May 9, 2022
656ff50
insegel theme
reubenharry May 9, 2022
e49115e
stanford theme
reubenharry May 9, 2022
e8236f0
lint
reubenharry May 9, 2022
aab90b0
lint
reubenharry May 10, 2022
84dd4f0
ormulo
May 10, 2022
072137a
nix ormolu fixes:
May 10, 2022
3bce561
merge main
reubenharry May 11, 2022
803327b
docs
reubenharry May 12, 2022
01512ec
trying to fix mathjax
reubenharry May 15, 2022
54814c1
Merge branch 'master' into docs
reubenharry May 15, 2022
69173bb
merge docs
reubenharry May 15, 2022
f3f53c6
remove empirical
reubenharry May 15, 2022
ec3fe94
merge docs
reubenharry May 15, 2022
af06b06
notebooks
reubenharry May 15, 2022
810aad9
Merge branch 'notebook' into histogram
reubenharry May 15, 2022
aab8095
notebooks: sampling
reubenharry May 15, 2022
8d143d2
merge cleanup
reubenharry May 15, 2022
0b94b79
merge notebook
reubenharry May 15, 2022
5287df4
cleaning up notebooks
reubenharry May 16, 2022
5dddb4f
remove helpers
reubenharry May 16, 2022
b829b03
Merge branch 'cleanup' into notebook
reubenharry May 16, 2022
a6d29da
when
reubenharry May 16, 2022
e7652d7
no helpers
reubenharry May 16, 2022
62319ee
models: lts
reubenharry May 16, 2022
9cff121
models: lts
reubenharry May 16, 2022
1cf551e
SMC notebook
reubenharry May 16, 2022
c2988d6
merged
reubenharry May 16, 2022
fd97ec3
weighted sampling
reubenharry May 17, 2022
2e6dd30
formatting
reubenharry May 18, 2022
70426f6
Merge branch 'notebook' into cleanup
reubenharry May 18, 2022
9f887b3
working on analytic
reubenharry May 18, 2022
6d36c1c
test enumerator
reubenharry May 18, 2022
6d4839c
discovered a bug in Integrator
reubenharry May 18, 2022
13cc36f
first attempt to fix normalization bug
reubenharry May 18, 2022
e19b2db
bug seems to be fixed
reubenharry May 18, 2022
bf7bdbf
update tests
reubenharry May 18, 2022
96ea8df
working tests: normalNormal goes awry if observations are huge
reubenharry May 19, 2022
328a16f
cleanup
reubenharry May 19, 2022
f378401
Bayesian in class, and remove LinearWithOutliers
reubenharry May 19, 2022
4b3e4e5
model updates
reubenharry May 23, 2022
3e21cff
yaml
Jun 4, 2022
77df67d
prepare for PR
Jun 6, 2022
93cf699
address warnings
Jun 6, 2022
9782c8c
review
Jun 6, 2022
4d8e98c
more warnings fixed
Jun 8, 2022
ea6cb50
merged
reubenharry Jun 9, 2022
51268c7
less warnings
Jun 9, 2022
9978861
update ghc version stated in cabal
reubenharry Jun 9, 2022
1aef717
Merge branch 'cleanup' of github.com:tweag/monad-bayes into cleanup
reubenharry Jun 9, 2022
8e32fbf
remove warnings
Jun 9, 2022
16f5e11
merge main
Jun 9, 2022
6c78b06
fix build
Jun 9, 2022
7c69878
version bump
Jun 9, 2022
8174b55
merged but tests fail
Jun 9, 2022
0fc99be
remove safe
Jun 9, 2022
d7011b5
Merge branch 'cleanup' into histogram
Jun 9, 2022
b293eca
fix merge
Jun 9, 2022
54b0165
fix rmsmc bug you introduced
reubenharry Jun 9, 2022
cb6cc06
integrator as MonadInfer
reubenharry Jun 10, 2022
3777955
updates
reubenharry Jun 10, 2022
165bd24
updates
reubenharry Jun 10, 2022
9fd5c00
updates
reubenharry Jun 10, 2022
e2ee3a9
fix the integrator
reubenharry Jun 10, 2022
b418558
fix the normalization of the integrator
reubenharry Jun 10, 2022
59ef114
merged
reubenharry Jun 11, 2022
fe29850
inference
reubenharry Jun 11, 2022
ae07cda
rope in analyticT
reubenharry Jun 13, 2022
7a1f775
remove integratorT
Jun 13, 2022
3e388f4
nix notebooks
Jun 14, 2022
1dad9f5
notebooks
Jun 14, 2022
fdb0177
rmsmc
Jun 14, 2022
0e074a8
lazy ppl
Jun 14, 2022
0a2dd21
hoistFirst bug again
Jun 15, 2022
44d3363
hoistFirst bug again
Jun 15, 2022
935ad8f
diagrams
Jun 15, 2022
277cb63
fp notebook
Jun 15, 2022
021aa60
remove integratorT
Jun 15, 2022
757d8cc
Merge branch 'cleanup' into histogram
Jun 15, 2022
ea01eaa
Merge branch 'histogram' into analytic
Jun 15, 2022
463b12f
changelog and version bump
Jun 15, 2022
20d7c21
Merge branch 'models' into lazyppl
Jun 15, 2022
2929ffa
presentation
Jun 16, 2022
c791cc9
notebooks
Jun 17, 2022
96d6523
notebooks
Jun 17, 2022
d1c6a19
cleanup
Jun 17, 2022
3faeb6f
docs
Jun 18, 2022
20275a5
docs
Jun 18, 2022
8ad0eac
presentation
Jun 20, 2022
5b55ea4
update talk
Jun 21, 2022
0b56e10
notebook
Jun 23, 2022
2d7ff29
works, but readd basic
Jun 27, 2022
99575ec
first working example of marginalization
Jun 27, 2022
c0cae57
renaming
Jun 28, 2022
3b3eb3d
continue cleanup
Jun 28, 2022
a71dabe
tests: not yet integrated in
Jun 28, 2022
343acde
tests: integrated in
Jun 28, 2022
49b72a7
bayesian pmmh
Jun 28, 2022
b4baf7b
switch over to new smc
Jun 28, 2022
bb4304c
switch over to new smc
Jun 28, 2022
d6beee0
improve resampleGeneric sum
Jun 28, 2022
78a0d52
independent in Class
Jun 28, 2022
20cf17b
parser
reubenharry Jul 5, 2022
74d1858
nbs
reubenharry Jul 8, 2022
3bdba31
no numSteps
reubenharry Jul 15, 2022
b9fa06c
make clean
reubenharry Jul 15, 2022
8c5e533
merge
reubenharry Jul 15, 2022
9f414a3
fix tests
reubenharry Jul 15, 2022
f096bf0
fix tests
reubenharry Jul 15, 2022
22c20b8
fix tests
reubenharry Jul 15, 2022
4849cb7
fix tests
reubenharry Jul 15, 2022
bb82c1b
pmmh
reubenharry Jul 15, 2022
5453626
withParticles
reubenharry Jul 15, 2022
4890596
withParticles
reubenharry Jul 15, 2022
4f7e666
Sequential.Free
reubenharry Jul 18, 2022
18c3fc1
Merge branch 'api' into sequentialCata
reubenharry Jul 18, 2022
c1c83af
Sequential.Free
reubenharry Jul 18, 2022
1d51b15
Density.Free + docs
reubenharry Jul 18, 2022
ce61682
docs
reubenharry Jul 18, 2022
be6331b
docs
reubenharry Jul 18, 2022
26d3d9e
Sampler.Strict
reubenharry Jul 18, 2022
0f8bd34
update nix
reubenharry Jul 25, 2022
72c5a79
cabal format
reubenharry Jul 25, 2022
d64962c
update nix
reubenharry Jul 25, 2022
7245f09
fix build
reubenharry Jul 25, 2022
86a72ef
merge sampler changes, fix changelog
reubenharry Jul 26, 2022
409b14d
bump changelog and version. bump ghc version to 9.2.3
reubenharry Jul 26, 2022
f99ed20
update docs and change mcmc so unweighted is not needed
reubenharry Jul 26, 2022
bc44e0f
remove sequentialCata
reubenharry Jul 28, 2022
331171c
Merge branch 'simplifyFreeSampler' into lazyppl
reubenharry Jul 28, 2022
73bee27
nbs
reubenharry Jul 28, 2022
95fbea3
nbs
reubenharry Jul 28, 2022
6f1275c
add docs and Distribution, Measure, Kernel
reubenharry Jul 28, 2022
b7e9bdc
merge
reubenharry Jul 28, 2022
eacf068
fix cabal
reubenharry Jul 28, 2022
45e9bf0
merge
reubenharry Jul 28, 2022
a62c5b5
Merge branch 'lazyppl' into notebooks
reubenharry Jul 28, 2022
fee39c5
updating notebooks
reubenharry Jul 28, 2022
9dd2cba
Merge branch 'lazyppl' into notebooks
reubenharry Jul 28, 2022
dd6c37f
Merge branch 'distribution' into notebooks
reubenharry Jul 28, 2022
215d5c6
updated notebooks
reubenharry Aug 1, 2022
89d029c
mhTrans'
reubenharry Aug 1, 2022
0b23adb
fix mh
reubenharry Aug 1, 2022
897addb
Merge branch 'simplifyFreeSampler' into lazyppl
reubenharry Aug 1, 2022
533d89d
merge
reubenharry Aug 1, 2022
fe722ff
website
reubenharry Aug 1, 2022
74aadb4
website
reubenharry Aug 2, 2022
9bce8cf
bayesian
reubenharry Aug 2, 2022
ae5f8b2
fix import
reubenharry Aug 5, 2022
5a24488
fix pmmh example
reubenharry Aug 9, 2022
a70b644
tui
reubenharry Aug 16, 2022
fdb8dda
no streamly
reubenharry Aug 16, 2022
84f37f5
lint
reubenharry Aug 26, 2022
ebb395e
lint
reubenharry Aug 26, 2022
a481550
piped mcmc
reubenharry Aug 26, 2022
6c6c9f5
merge
reubenharry Aug 30, 2022
a123b93
merge
reubenharry Aug 30, 2022
b300ca4
fixing nix
reubenharry Sep 9, 2022
df35f54
notebooks
reubenharry Sep 11, 2022
d63036d
notebooks
reubenharry Sep 11, 2022
01c15f9
docs
reubenharry Sep 13, 2022
8971c05
Merge branch 'master' into notebooks
reubenharry Sep 14, 2022
3ae8e37
fix tui
reubenharry Sep 14, 2022
683c83a
new notebook
reubenharry Sep 18, 2022
f404792
new notebook
reubenharry Sep 19, 2022
64fc6d9
Update README.md
reubenharry Sep 22, 2022
1eedfaf
Update monad-bayes-site/about.md
reubenharry Sep 22, 2022
8e3f5f2
Update monad-bayes-site/about.md
reubenharry Sep 22, 2022
5973638
Update docs/source/probprog.md
reubenharry Sep 22, 2022
0a1b3fe
Update docs/source/probprog.md
reubenharry Sep 22, 2022
4ef6945
Update README.md
reubenharry Sep 23, 2022
bc254a6
Remove notebook's checkpoints and add them to .gitignore (#200)
mknorps Oct 3, 2022
ba0647c
Update monad-bayes-site/about.md
mknorps Oct 4, 2022
7512204
Maria/website adjustments (#201)
mknorps Oct 4, 2022
2ef32d9
UPD readme with clearer instruction of notebooks run
mknorps Oct 5, 2022
52eb47f
chore(deps): bump DeterminateSystems/update-flake-lock from 13 to 14 …
dependabot[bot] Sep 15, 2022
2d26606
flake.lock: Update (#195)
github-actions[bot] Sep 30, 2022
8c2079e
flake.lock: Update (#199)
github-actions[bot] Oct 3, 2022
d741800
Update nix flakes lock
mknorps Oct 6, 2022
74c4066
UPD envrc (#202)
mknorps Oct 7, 2022
8e622b5
Notebooks: Introduction.ipynb adjustments (#205)
mknorps Oct 7, 2022
a63ee33
DEL duplicate notebooks in the folder
mknorps Oct 12, 2022
cdb36f2
resolve warnings - shadowed variables and unused imports
mknorps Oct 12, 2022
b4904b7
DEL Wiener process (#215)
mknorps Oct 15, 2022
78cb51b
Merge branch 'master' into notebooks
mknorps Oct 18, 2022
47c7595
Manual update of flake.lock
mknorps Oct 18, 2022
3 changes: 2 additions & 1 deletion .envrc
@@ -1 +1,2 @@
# use flake
# Make sure you have direnv >= 2.30
use flake --extra-experimental-features nix-command --extra-experimental-features flakes
3 changes: 3 additions & 0 deletions .gitignore
@@ -23,6 +23,7 @@ docs/build
.hpc
.hsenv
.vscode
.envrc
.stack-work/
cabal-dev
cabal.project.local
@@ -34,4 +35,6 @@ dist-*
/.pre-commit-config.yaml
/.direnv
venv/
.venv/
.ipynb_checkpoints/
.netlify/
18 changes: 11 additions & 7 deletions CHANGELOG.md
@@ -1,4 +1,10 @@
# 0.2.0 (2021-07-26)
# 1.0.0 (2022-09-10)

- host website from repo
- host notebooks from repo
- use histogram-fill

# 0.2.0 (2022-07-26)

- rename various functions to match the names of the corresponding types (e.g. `Enumerator` goes with `enumerator`)
- add configs as arguments to inference methods `smc` and `mcmc`
@@ -7,11 +13,11 @@
- update history of changelog in line with semantic versioning conventions
- bumped to GHC 9.2.3

# 0.1.5 (2021-07-26)
# 0.1.5 (2022-07-26)

- Refactor of sampler to be parametric in the choice of a pair of IO monad and RNG

# 0.1.4 (2021-06-15)
# 0.1.4 (2022-06-15)

Addition of new helper functions, plotting tools, tests, and Integrator monad.

@@ -21,8 +27,7 @@ Addition of new helper functions, plotting tools, tests, and Integrator monad.
- new tests, including with conjugate distributions to compare analytic solution against inferred posterior
- `models` directory is cleaned up. New sequential models using `pipes` package to represent monadic streams


# 0.1.3 (2021-06-08)
# 0.1.3 (2022-06-08)

Clean up of unused functions and broken code

@@ -31,14 +36,13 @@ Clean up of unused functions and broken code
- explicit imports
- added some global language extensions

# 0.1.2 (2021-06-08)
# 0.1.2 (2022-06-08)

Add documentation

- docs written in markdown
- docs built by sphinx


# 0.1.1 (2020-04-08)

- New exported function: `Control.Monad.Bayes.Class` now exports `discrete`.
9 changes: 9 additions & 0 deletions MAINTAINERS.md
@@ -47,3 +47,12 @@ A **new major GHC version** has been released. Here's what you need to do:

[ghc-major]: https://gitlab.haskell.org/ghc/ghc/wikis/working-conventions/releases#major-releases
[hackage-pvp]: https://pvp.haskell.org/


## Documentation

The docs are built with Sphinx. Once Sphinx is installed, `cd` to the `docs` directory and run `make html` to build locally. CI does this automatically, so to update the docs, just update the markdown (e.g. docs/source/usage.md) and push.

## Website

The website is also hosted in the repo (`/monad-bayes-site`), and is built with `hakyll`. Do `stack exec site build` to build. CI **does not** automatically build the site, so to update, you will need to run this command, and only then push to github.
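
For orientation, a minimal `hakyll` site program looks roughly like the sketch below. This is a generic illustration only, not the repository's actual `monad-bayes-site` code; the matched paths and rules are assumptions.

```haskell
-- Generic hakyll sketch (illustration only, not the actual monad-bayes site code):
-- compile every top-level markdown page to HTML via pandoc.
{-# LANGUAGE OverloadedStrings #-}

import Hakyll

main :: IO ()
main = hakyll $ do
  match "*.md" $ do
    route $ setExtension "html"                  -- e.g. about.md -> about.html
    compile $ pandocCompiler >>= relativizeUrls  -- render markdown, fix relative links
```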
16 changes: 10 additions & 6 deletions README.md
@@ -1,5 +1,7 @@
# [Monad-Bayes](https://monad-bayes-site.netlify.app/_site/about.html)

A library for probabilistic programming in Haskell.

<!-- [![Hackage](https://img.shields.io/hackage/v/monad-bayes.svg)](https://hackage.haskell.org/package/monad-bayes)
[![Stackage](http://stackage.org/package/monad-bayes/badge/lts)](http://stackage.org/lts/package/monad-bayes)
[![Hackage Deps](https://img.shields.io/hackage-deps/v/monad-bayes.svg)](http://packdeps.haskellers.com/reverse/monad-bayes)
@@ -11,12 +13,11 @@

<!-- See the [documentation](https://monad-bayes.netlify.app/) for a quick-start user guide and a reference overview of how it all works. -->

Created by [Adam Scibior][adam-web] ([@adscib][adam-github]), documentation, website and various extras by [Reuben][reuben-web], maintained by [Tweag I/O][tweagio].
Created by [Adam Scibior][adam-web] ([@adscib][adam-github]), documentation, website and newer features by [Reuben][reuben-web], maintained by [Tweag][tweagio].

## Project status

Now that `monad-bayes` has been released on Hackage, we will focus on improving
documentation, adding a variety of applications, improving the API, and making inference more easily customizable.
Now that `monad-bayes` has been released on Hackage, and the documentation and the API have been updated, we will focus on adding new features. See the GitHub issues to get a sense of what is being prepared, and please feel free to make requests.

## Background

@@ -48,11 +49,14 @@ for probabilistic programs][thesis-doi]. Thesis. University of Cambridge.

Now you can use `stack build`, `stack test` and `stack ghci`.

To use the notebooks in the `notebooks` directory, you will first need `nix`. Then:
**To view the notebooks, go to the website**. To use the notebooks interactively:

1. Run `nix develop --extra-experimental-features nix-command --extra-experimental-features flakes`
1. Compile the source: `stack build`
2. If you do not have `nix` [install it](https://nixos.org/download.html).
3. Run `nix develop --system x86_64-darwin --extra-experimental-features nix-command --extra-experimental-features flakes` - this should open a nix shell. For Linux, use `x86_64-linux` for the `--system` option instead.
4. Run `jupyter-lab` from the nix shell to load the notebooks.

2. This should open a shell, from which you can run `jupyter-lab` to load the notebooks
Your mileage may vary.

[adam-github]: https://github.com/adscib
[adam-web]: https://www.cs.ubc.ca/~ascibior/
4 changes: 1 addition & 3 deletions benchmark/Single.hs
@@ -9,9 +9,8 @@ import Control.Monad.Bayes.Inference.SMC
smc,
)
import Control.Monad.Bayes.Population
import Control.Monad.Bayes.Population (population)
import Control.Monad.Bayes.Sampler.Strict
import Control.Monad.Bayes.Traced
import Control.Monad.Bayes.Traced hiding (model)
import Control.Monad.Bayes.Weighted
import Control.Monad.ST (runST)
import Data.Time (diffUTCTime, getCurrentTime)
@@ -31,7 +30,6 @@ import Options.Applicative
option,
short,
)
import System.Random.MWC (GenIO, createSystemRandom)

data Model = LR Int | HMM Int | LDA (Int, Int)
deriving stock (Show, Read)
58 changes: 40 additions & 18 deletions docs/source/probprog.md
@@ -1,4 +1,4 @@
# What is probabilistic programming
# Quickstart

Probabilistic programming is all about being able to write probabilistic models as programs. For instance, here is a Bayesian linear regression model, which we would write equationally as:

@@ -18,39 +18,38 @@ y_{n}=\alpha+\beta x_{n}+\epsilon_{n}
but in code as:

```haskell
paramPriorRegression :: MonadSample m => m (Double, Double, Double)
paramPriorRegression :: Distribution (Double, Double, Double)
paramPriorRegression = do
slope <- normal 0 2
intercept <- normal 0 2
noise <- gamma 4 4
return (slope, intercept, noise)

regression :: (MonadInfer m) => [Double] -> [Double] -> m (Double, Double, Double)
regression xs ys = do
params@(slope, intercept, noise) <- paramPriorRegression
forM_ (zip xs ys) \(x, y) -> factor $ normalPdf (slope * x + intercept) (sqrt noise) y
regression :: Kernel [(Double, Double)] (Double, Double, Double)
regression xsys = do
(slope, intercept, noise) <- paramPriorRegression
forM_ xsys \(x, y) -> factor $ normalPdf (slope * x + intercept) (sqrt noise) y
return (slope, intercept, noise)
```

`regression` takes observations of `xs` and `ys`, and using the prior expressed by `paramPriorRegression`, returns the posterior conditioned on the observations.
`regression` takes observations (a list of pairs of x and y values), and using the prior expressed by `paramPriorRegression`, returns the posterior conditioned on the observations.

This is the *model*. To perform *inference* , suppose we have a data set of `xs` and `ys` like:
This is the *model*. To perform *inference* , suppose we have a data set `xsys` like:

![](_static/priorpred.png)

We could then run the model as follows:
To run the model

```haskell
mhRunsRegression = sampler
$ unweighted
$ mcmc MCMCConfig
{numMCMCSteps = 1500,
proposal = SingleSiteMH,
numBurnIn = 500}
random
(regression xsys)
```

This yields 1000 samples from an MCMC walk using an MH kernel. `mh n` produces a distribution over chains of length `n`, along with the probability of that chain. `prior` throws away that weight (we don't care about the probability of the chain itself), and `sampler` samples a particular chain. Plotting one gives:
This yields 1000 samples from an MCMC walk using an MH kernel. `mh n` produces a distribution over chains of length `n`, along with the probability of that chain. Sampling a chain and plotting its final state gives:

![](_static/regress.png)

Expand All @@ -74,15 +73,22 @@ A distribution in monad-bayes over a set {math}`X`, is of type:
MonadInfer m => m X
```

monad-bayes provides standard distributions, such as:
For beginner friendliness, a synonym `type Measure a = forall m . MonadSample m => m a` is provided (as well as `Distribution` shown above, for normalized distributions, and `Kernel` for functions into distributions).

- `random :: MonadInfer m => m Double` : sample uniformly from {math}`[0,1]`
Monad-bayes provides standard distributions, such as

```haskell
random :: Distribution Double
```

which is distributed uniformly over {math}`[0,1]`.

The full set is listed at https://hackage.haskell.org/package/monad-bayes-0.1.1.0/docs/Control-Monad-Bayes-Class.html

Note that these primitives already allow us to construct quite exotic distributions, like the uniform distribution over `(+) :: Int -> Int -> Int` and `(-) :: Int -> Int -> Int`:

```haskell
distributionOverFunctions :: Distribution (Int -> Int -> Int)
distributionOverFunctions = uniformD [(+), (-)]
```
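
To see what this means in practice, here is a small usage sketch. It is illustrative only and not part of this page; it assumes the `sampler` runner used elsewhere in these docs and the `distributionOverFunctions` definition above.

```haskell
-- Illustrative usage sketch: draw one of the two operators at random and apply it.
-- Assumes `sampler` (used elsewhere on this page to run a sampling computation in IO)
-- and the `distributionOverFunctions` definition above.
applyRandomOp :: IO Int
applyRandomOp = do
  op <- sampler distributionOverFunctions  -- op is (+) or (-), each with probability 1/2
  pure (op 10 3)                           -- so the result is 13 or 7
```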

Expand Down Expand Up @@ -261,7 +267,7 @@ which gives
[([1,2,3,4],0.5),([2,3,4,5],0.5)]
```

## Near exact inference for continuous distributions
### Near exact inference for continuous distributions

Monad-Bayes does not currently support exact inference (via symbolic solving) for continuous distributions. However, it *does* support numerical integration. For example, for the distribution defined by

@@ -401,6 +407,12 @@ The final element of the chain is the head of the list, so you can drop samples

You can also run `MCMC` using `mcmcP`. This creates an infinite chain, expressed as a stream or using the corresponding type from the `pipes` library, a `Producer`. This is a very natural representation of a random walk in Haskell.

You can run this with a terminal user interface (TUI) by doing e.g.

```haskell
tui 0 random noVisual
```

## Sequential Monte Carlo (Particle Filtering)

Run SMC with two resampling steps and two particles as follows, given a model `m`:
@@ -419,6 +431,15 @@ output =
<!-- Or if you prefer, think of the inference method as:


```haskell
(sampler . population . smc SMCConfig {numSteps = 2, numParticles = 2, resampler = resampleMultinomial} random)
:: Sequential (Population SamplerIO) a -> IO [(a, Numeric.Log.Log Double)]
``` -->

<!-- `Sequential (Population SamplerIO)` is an instance of `MonadInfer`, so we can apply this inference method to any distribution. For instance, to use our now familiar `example`: -->

As a concrete example, here is a probabilistic program:

```haskell
(sampler . population . smc SMCConfig {numSteps = 2, numParticles = 2, resampler = resampleMultinomial} random)
:: Sequential (Population SamplerIO) a -> IO [(a, Numeric.Log.Log Double)]
@@ -454,7 +475,7 @@ The result:

Each of these is a particle with a weight. In this simple case, they are all identical - obviously in general they won't be.

`numSteps` is the number of steps that the `SMC` algorithm takes, i.e. how many times it resamples. This should generally be the number of factor statements in the program. `numParticles` is the size of the population. Larger is better but slower.
`numSteps` is the number of steps that the `SMC` algorithm takes, i.e. how many times it resamples. Specify it as either `All` or `Only n` for `n` an integer. This should generally be the number of factor statements in the program. `numParticles` is the size of the population. Larger is better but slower.

`resampler` is the mechanism used to resample the population of particles after each `factor` statement.

Expand Down Expand Up @@ -672,7 +693,7 @@ For API docs in the normal Haskell style, see [hackage](https://hackage.haskell.

# Monad-Bayes vs other libraries

Monad-bayes is a universal probabilistic programming language, in the sense that you can express any computable distribution. In this respect it differs from Stan, which focuses instead on handling inference on an important subset well.
Monad-bayes is a universal probabilistic programming system, in the sense that you can express any computable distribution. In this respect it differs from Stan, which focuses instead on handling inference on an important subset well.

There is a variety of universal probabilistic programming libraries and/or languages, which include WebPPL, Gen, Pyro and Edward.

@@ -682,6 +703,7 @@ A lot of engineering work has been put into the above libraries and languages to

**What Monad-Bayes has that is unique**:

Monad-Bayes is just a library, unlike almost all other PPLs, which are separate languages written inside another language. As such, probabilistic programs in monad-bayes are first class programs in Haskell with no new special syntax or keywords. This allows all of Haskell's expressive power to be brought to bear. You can write distributions over any datatype (lists, trees, functions, histograms, JSON files, graphs, diagrams, etc). You can use powerful libraries like `pipes`, `lens` and `Parsec`. Everything is pure. Everything is strongly typed. You can make use of laziness.

Models are monadic and inference is modular. Complex inference algorithms like RMSMC or PMMH are built out of simple composable pieces, and so are expressable extraordinarily simply.

Probabilistic programs in monad-bayes are first class programs in Haskell. There's no new special syntax or keywords. This allows all of Haskell's expressive power to be brought to bear. You can write distributions over any datatype (lists, trees, functions, histograms, JSON files, graphs, diagrams, etc). You can use powerful libraries like Pipes, lens and Parsec. Everything is pure. Everything is strongly typed. You can make use of laziness.
18 changes: 8 additions & 10 deletions docs/source/usage.md
@@ -6,7 +6,7 @@ That's enough to understand the core ideas, but for the more advanced content, y

## References

monad-bayes is the codebase accompanying the theory of probabilistic programming described in [this paper](https://www.denotational.co.uk/publications/scibior-kammar-ghahramani-funcitonal-programming-for-modular-bayesian-inference.pdf).
monad-bayes is the codebase accompanying the theory of probabilistic programming described in [this paper](https://arxiv.org/pdf/1711.03219.pdf).

## The core typeclasses

@@ -383,7 +383,7 @@ pushEvidence ::

In other words, `pushEvidence` takes a `Population m a` where `m` is a `MonadCond` instance. It takes the sum of the weights, divides the weights by it, and then factors by the sum in `m`.
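
As a rough illustration of that description, here is a sketch written over a plain list of weighted particles rather than the library's `Population` type. The name `pushEvidenceSketch` and the list representation are assumptions made for the example, not the library's implementation.

```haskell
-- Sketch of the behaviour described above, over an explicit list of weighted particles.
import Control.Monad.Bayes.Class (MonadCond, factor)
import Numeric.Log (Log)

pushEvidenceSketch :: MonadCond m => [(a, Log Double)] -> m [(a, Log Double)]
pushEvidenceSketch particles = do
  let z = sum (map snd particles)          -- total weight: the evidence estimate
  factor z                                 -- push that mass into the outer monad
  pure [(x, w / z) | (x, w) <- particles]  -- return the normalized particles
```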

### Sequential
### Sequential

Summary of key info on `Sequential`:

@@ -477,6 +477,7 @@ hoistFirst f = Sequential . Coroutine . f . resume . runSequential

When `m` is `Population n` for some other `n`, then `resampleGeneric` gives us one example of the natural transformation we want. In other words, operating in `Sequential (Population n)` works, and not only works but does something statistically interesting: particle filtering (aka SMC).

**Note**: the running of `Sequential`, i.e. getting from `Sequential m a` to `m a` is surprisingly subtle, and there are many incorrect ways to do it, such as plain folds of the recursive structure. These can result in a semantics in which the transformation gets applied an exponentially large number of times.



@@ -704,20 +705,17 @@ The version of MCMC in monad-bayes performs its random walk on program traces of
A single step in this chain (in Metropolis-Hastings MCMC) looks like this:

```haskell
mhTrans :: MonadSample m =>
Weighted (Density m) a -> Trace a -> m (Trace a)
mhTrans m t@Trace {variables = us, density = p} = do
mhTrans :: MonadSample m => (Weighted (State.Density m)) a -> Trace a -> m (Trace a)
mhTrans m t@Trace {variables = us, probDensity = p} = do
let n = length us
us' <- do
i <- discrete $ discreteUniformAB 0 (n -1)
i <- discrete $ discreteUniformAB 0 (n - 1)
u' <- random
case splitAt i us of
(xs, _ : ys) -> return $ xs ++ (u' : ys)
_ -> error "impossible"
((b, q), vs) <-
runWriterT $ weighted $ Weighted.hoist (WriterT . density us') m
let ratio = (exp . ln) $ min 1
(q * fromIntegral n / (p * fromIntegral (length vs)))
((b, q), vs) <- State.density (weighted m) us'
let ratio = (exp . ln) $ min 1 (q * fromIntegral n / (p * fromIntegral (length vs)))
accept <- bernoulli ratio
return $ if accept then Trace vs b q else t
```
12 changes: 6 additions & 6 deletions flake.lock

Some generated files are not rendered by default.
