News, Technology

Facebook open-sources Opacus, a PyTorch training library with differential privacy

Written by Hamnah Khalid · 1 min read

Facebook has open-sourced Opacus, a PyTorch training library that uses Differential Privacy (DP) to train models.

Facebook says it hopes the release will make it easier for engineers everywhere to adopt Differential Privacy as a data-protection technique.

Differential Privacy works by injecting a small amount of noise during model training, which makes it much harder to recover the original data from the trained model. In other words, the algorithm used by Opacus is considered differentially private because an observer looking at a model it trained cannot tell whether any particular individual's data was used during training.
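For readers who want the formal statement behind that guarantee (background added here, not part of Facebook's post): a randomized training algorithm M is (ε, δ)-differentially private if, for any two datasets D and D′ that differ in a single record and any set of possible outputs S,

\[
\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S] + \delta .
\]

Smaller ε and δ mean that adding or removing any one person's data changes the model's output distribution only negligibly, which is exactly the "an observer cannot tell" property described above.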

Opacus aims to preserve the privacy of any real data acquired for training purposes whilst simultaneously limiting any negative effects on the performance of the trained model. “Our approach is centered on differentially private stochastic gradient descent,” says Facebook in a blog post.

“The core idea behind this algorithm is that we can protect the privacy of a training dataset by intervening on the parameter gradients that the model uses to update its weights, rather than the data directly.” This means that at every training iteration, Opacus adds noise to the parameter gradients rather than to the data itself, which stops the algorithm from memorising individual training samples. Because training sets for machine learning models are typically large, the noise tends to average out over many samples.
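To make that idea concrete, here is a minimal, illustrative sketch of a single DP-SGD step written in plain PyTorch. This is toy code prepared for this article, not Opacus's actual implementation; the model, data, and hyperparameters are placeholders. Each sample's gradient is clipped to a fixed norm, Gaussian noise is added, and only then is the averaged gradient applied.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)             # toy model standing in for any PyTorch model
loss_fn = nn.CrossEntropyLoss()
max_grad_norm = 1.0                  # per-sample clipping bound C
noise_multiplier = 1.1               # noise scale, relative to C
lr = 0.05

x = torch.randn(32, 10)              # one toy batch of "private" data
y = torch.randint(0, 2, (32,))

# Accumulate clipped per-sample gradients instead of one batch gradient.
summed_grads = [torch.zeros_like(p) for p in model.parameters()]
for xi, yi in zip(x, y):
    model.zero_grad()
    loss = loss_fn(model(xi.unsqueeze(0)), yi.unsqueeze(0))
    loss.backward()
    # Clip this sample's gradient so its overall norm is at most max_grad_norm.
    total_norm = torch.sqrt(sum(p.grad.norm() ** 2 for p in model.parameters()))
    clip_coef = (max_grad_norm / (total_norm + 1e-6)).clamp(max=1.0)
    for g_sum, p in zip(summed_grads, model.parameters()):
        g_sum += p.grad * clip_coef

# Add Gaussian noise to the summed gradients, then take an averaged step.
with torch.no_grad():
    for p, g_sum in zip(model.parameters(), summed_grads):
        noise = torch.normal(0.0, noise_multiplier * max_grad_norm, size=p.shape)
        p -= lr * (g_sum + noise) / len(x)
```

Because the noise is calibrated to the clipping bound rather than to any individual sample, its relative effect shrinks as the batch and dataset grow, which is why large training sets keep the accuracy cost low.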

Facebook observes that the privacy-preserving machine learning (PPML) community has grown rapidly over the past few years, and that this community can benefit greatly from a library like Opacus.

Opacus offers fast training, strong data protection, and the flexibility to plug the differentially private training algorithm into existing PyTorch code, as sketched below. It also ships with numerous tutorials to help developers along the way.
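As a rough illustration of that workflow, the sketch below attaches Opacus's PrivacyEngine to an ordinary PyTorch model, optimizer, and data loader. The toy model and data are placeholders, and the exact arguments may vary between Opacus versions; the point is that the training loop itself stays plain PyTorch.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

model = nn.Linear(10, 2)                                   # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data_loader = DataLoader(
    TensorDataset(torch.randn(256, 10), torch.randint(0, 2, (256,))),
    batch_size=32,
)

# Wrap the existing model, optimizer, and data loader with Opacus.
privacy_engine = PrivacyEngine()
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.1,   # scale of the added Gaussian noise
    max_grad_norm=1.0,      # per-sample gradient clipping bound
)

# From here on, the training loop is unchanged PyTorch code.
loss_fn = nn.CrossEntropyLoss()
for x, y in data_loader:
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()        # Opacus clips per-sample gradients and adds noise here
```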

“With the release of Opacus, we hope to provide an easier path for researchers and engineers to adopt differential privacy in ML, as well as to accelerate DP research in the field,” the company adds.