
Differential Privacy & Federated Learning

Curated notebooks on how to train neural networks using differential privacy and federated learning.

What is differential privacy?

Differential Privacy is a set of techniques for preventing a model from accidentally memorizing secrets present in a training dataset during the learning process.

The key points of Differential Privacy are:

  • Make a promise to the data subject: you won’t be affected, adversely or otherwise, by allowing your data to be used in any analysis, no matter what other studies, datasets, or information sources are available.
  • Ensure that models learning from sensitive data learn only what they are supposed to learn, without accidentally learning what they are not supposed to learn from that data.
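
In training terms, that second point is commonly approximated by clipping each example's gradient and adding calibrated noise before the weight update (the core idea of DP-SGD). The sketch below is a minimal illustration of that idea; the helper name `dp_sgd_step` and the use of PyTorch are assumptions for illustration, not taken from this repository's notebooks.

```python
import torch

def dp_sgd_step(model, loss_fn, batch, lr=0.1, clip_norm=1.0, noise_multiplier=1.1):
    """One illustrative DP-SGD-style update: clip per-example gradients, add Gaussian noise.

    Hypothetical helper for illustration; `model`, `loss_fn`, and `batch` are assumed
    to be a standard PyTorch module, loss function, and (inputs, targets) pair.
    """
    inputs, targets = batch
    params = [p for p in model.parameters() if p.requires_grad]
    summed_grads = [torch.zeros_like(p) for p in params]

    # Clip each example's gradient individually so no single record
    # can dominate (and thus be memorized through) the update.
    for x, y in zip(inputs, targets):
        model.zero_grad()
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        loss.backward()
        grads = [p.grad.detach().clone() for p in params]
        total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = min(1.0, clip_norm / (total_norm + 1e-6))
        for s, g in zip(summed_grads, grads):
            s.add_(g * scale)

    # Add Gaussian noise calibrated to the clipping norm,
    # then apply an averaged SGD step.
    batch_size = len(inputs)
    with torch.no_grad():
        for p, s in zip(params, summed_grads):
            noise = torch.normal(0.0, noise_multiplier * clip_norm, size=p.shape)
            p -= lr * (s + noise) / batch_size
```

The strength of the privacy guarantee depends on the clipping norm, the noise multiplier, and how many such steps are taken; libraries such as Opacus or TensorFlow Privacy track the resulting privacy budget.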

Federated Learning

Instead of bringing all the data to one place for training, federated learning brings the model to the data. This allows each data owner to keep the only copy of their information.
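
A common way to combine the locally trained models is federated averaging (FedAvg): each data owner trains a copy of the model on their own data, and only the updated parameters travel back to be averaged. The sketch below is a minimal single-round illustration; the helper name `federated_average`, the equal weighting of clients, and the use of PyTorch DataLoaders are assumptions for illustration, not the repository's implementation.

```python
import copy

import torch

def federated_average(global_model, client_loaders, loss_fn, epochs=1, lr=0.01):
    """Illustrative FedAvg round: each client trains locally; only weights travel back.

    Hypothetical helper for illustration; `client_loaders` is assumed to be a list of
    PyTorch DataLoaders, one per data owner, and the model to have float parameters.
    """
    client_states = []
    for loader in client_loaders:
        local_model = copy.deepcopy(global_model)        # "send" the model to the data
        optimizer = torch.optim.SGD(local_model.parameters(), lr=lr)
        for _ in range(epochs):
            for inputs, targets in loader:
                optimizer.zero_grad()
                loss = loss_fn(local_model(inputs), targets)
                loss.backward()
                optimizer.step()
        client_states.append(local_model.state_dict())   # raw data never leaves the client

    # Average the clients' parameters to form the new global model.
    averaged = {
        key: torch.stack([state[key].float() for state in client_states]).mean(dim=0)
        for key in client_states[0]
    }
    global_model.load_state_dict(averaged)
    return global_model
```

In practice FedAvg usually weights each client's contribution by its number of local examples, and the averaging step can itself be protected with techniques such as secure aggregation or differential privacy.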
