MMD, Hausdorff and Sinkhorn divergences scaled up to 1,000,000 samples.

N.B.: This repository is out of date. A reference implementation of Optimal Transport divergences for shape registration is now available in the geomloss repository (see its website and pip package).

Global divergences

This repository provides efficient implementations of Maximum Mean Discrepancies (a.k.a. kernel norms), Hausdorff and Sinkhorn divergences between sampled measures. Thanks to the KeOps library, our routines scale up to batches of 1,000,000 samples without memory overflows.
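To fix ideas, here is a minimal NumPy sketch of the squared MMD with a Gaussian kernel. The function name and the brute-force O(N·M)-memory estimator are illustrative choices, not this repository's API: the KeOps-backed routines compute the same reductions without materializing the full kernel matrices.

```python
import numpy as np

def gaussian_mmd(x, y, sigma=1.0):
    """Squared MMD between point clouds x (N, D) and y (M, D),
    with a Gaussian kernel of bandwidth sigma.

    Brute-force version for illustration only: it builds the full
    N-by-M kernel matrix instead of streaming the reduction.
    """
    def kernel_mean(a, b):
        # Pairwise squared distances, then the mean Gaussian kernel value.
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2)).mean()

    return kernel_mean(x, x) + kernel_mean(y, y) - 2 * kernel_mean(x, y)

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))
y = rng.normal(size=(100, 2)) + 3.0  # shifted cloud, well-separated from x
```

The divergence vanishes when both samples coincide and grows as the two clouds separate.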

N.B.: As of today, KeOps is still in beta. The 0.1 version will be released on pip by the end of October, including new documentation, Windows support and a bug fix for high-dimensional vectors.

Information on the subject is available in our papers.

First and foremost, this repo is about providing a reference implementation of Sinkhorn-related divergences. In /common/, you will find both a simple and an efficient implementation of the Sinkhorn algorithm. The folder global_divergences_ShapeMI2018 will let you reproduce the figures of our ShapeMI paper (MICCAI 2018 workshop), while sinkhorn_entropies contains those of our reference theoretical article.
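For reference, the core Sinkhorn loop can be sketched in log-domain NumPy as follows. This is a simplified, brute-force illustration with uniform weights and a quadratic cost; the function names and interface are hypothetical, and the actual implementations live in /common/.

```python
import numpy as np

def logsumexp(A, axis):
    # Numerically stable log-sum-exp reduction along the given axis.
    m = A.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(A - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def sinkhorn_potentials(x, y, eps=0.5, n_iter=200):
    """Log-domain Sinkhorn iterations between the uniform measures
    supported on x (N, D) and y (M, D), with cost |x - y|^2 / 2 and
    entropic regularization eps. Returns the dual potentials (f, g)."""
    N, M = len(x), len(y)
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1) / 2  # cost matrix
    f, g = np.zeros(N), np.zeros(M)
    for _ in range(n_iter):
        # Alternate soft-min updates on the two dual potentials.
        g = -eps * logsumexp((f[:, None] - C) / eps - np.log(N), axis=0)
        f = -eps * logsumexp((g[None, :] - C) / eps - np.log(M), axis=1)
    return f, g

rng = np.random.default_rng(1)
x, y = rng.normal(size=(50, 2)), rng.normal(size=(60, 2))
f, g = sinkhorn_potentials(x, y, eps=0.5, n_iter=200)
ot_eps = f.mean() + g.mean()  # dual entropic OT value, <a, f> + <b, g>

# Sanity check: the implied transport plan has unit total mass at convergence.
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1) / 2
plan = np.exp((f[:, None] + g[None, :] - C) / 0.5) / (50 * 60)
```

The debiased Sinkhorn divergence studied in our papers combines three such runs, S(a, b) = OT(a, b) - (OT(a, a) + OT(b, b)) / 2, which removes the entropic bias of the raw cost.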

Please note that to run some of our demos, you will need to install both PyTorch and KeOps. No worries: pip install pykeops should do the trick.
