Computing Tight Differential Privacy Guarantees Using FFT

Show full item record



Koskela, A, Honkela, A & Jälkö, J 2019, 'Computing Tight Differential Privacy Guarantees Using FFT', Privacy in Machine Learning: NeurIPS 2019 workshop, Vancouver, Canada, 14/12/2019 - 14/12/2019.

Title: Computing Tight Differential Privacy Guarantees Using FFT
Author: Koskela, Antti; Honkela, Antti; Jälkö, Joonas
Contributor: University of Helsinki, Department of Computer Science
University of Helsinki, Antti Honkela / Principal Investigator
Date: 2019-12-14
Language: eng
Abstract: Computing privacy parameters for the differentially private stochastic gradient descent method (DP-SGD) is equivalent to analysing one-dimensional mechanisms. We propose a numerical accountant for evaluating the (ε, δ)-privacy loss for mechanisms with continuous one-dimensional output. The proposed method is based on a numerical approximation of an integral formula which gives the tight (ε, δ)-values. The approximation is carried out by discretising the integral and by evaluating the resulting discrete convolutions using the fast Fourier transform algorithm. We focus on the subsampled Gaussian mechanism which underlies DP-SGD. We give both theoretical error bounds and numerical error estimates for the approximation. Experimental comparisons with state-of-the-art techniques demonstrate significant improvements in bound tightness and/or computation time. Python code for the method can be found on GitHub.
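The core idea described in the abstract — that composing mechanisms corresponds to convolving discretised distributions, and that repeated discrete convolutions can be evaluated quickly with the FFT — can be illustrated with a minimal sketch. This is not the authors' implementation; the grid, toy density, and composition count below are hypothetical, chosen only to show the FFT-convolution step.

```python
import numpy as np

def fft_self_convolve(p, k):
    """Convolve the discrete distribution p with itself k times via FFT.

    Zero-pads to a power of two at least the length of the k-fold linear
    convolution, so the circular FFT convolution equals the linear one.
    """
    n = len(p) * k - k + 1          # length of the k-fold linear convolution
    m = 1 << (n - 1).bit_length()   # next power of two for zero-padding
    fp = np.fft.rfft(p, m)
    out = np.fft.irfft(fp ** k, m)[:n]
    return np.maximum(out, 0.0)     # clip tiny negative FFT round-off

# A toy discretised density on a uniform grid (hypothetical stand-in for a
# privacy-loss distribution of the subsampled Gaussian mechanism).
grid = np.linspace(-4.0, 4.0, 257)
p = np.exp(-0.5 * grid**2)
p /= p.sum()                        # probability masses on the grid points

# Composing k = 8 identical mechanisms: one FFT instead of 7 direct convolutions.
k = 8
composed = fft_self_convolve(p, k)
print(composed.sum())               # total mass is preserved (close to 1)
```

For a grid of size n and k compositions this costs O(kn log(kn)) instead of the O(k n²) of direct convolution, which is the source of the speed-up the abstract refers to.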
Subject: 113 Computer and information sciences

Files in this item

File: poster_computing_tight_DP_2.pdf (199.2 KB, PDF)
