The Relative Gaussian Mechanism and its Application to Private Gradient Descent
Preprint, 2023

The Relative Gaussian Mechanism and its Application to Private Gradient Descent

Abstract

The Gaussian Mechanism (GM), which consists of adding Gaussian noise to a vector-valued query before releasing it, is a standard privacy protection mechanism. In particular, provided the query satisfies an L2 sensitivity property (the L2 distance between outputs on any two neighboring inputs is bounded), GM guarantees Rényi Differential Privacy (RDP). Unfortunately, precisely bounding the L2 sensitivity can be hard, leading to loose privacy bounds. In this work, we consider a relative L2 sensitivity assumption, in which the bound on the distance between two query outputs may also depend on their norm. Leveraging this assumption, we introduce the Relative Gaussian Mechanism (RGM), in which the variance of the noise depends on the norm of the output. We prove tight bounds on the RDP parameters under relative L2 sensitivity, and characterize the privacy loss incurred by using output-dependent noise. In particular, we show that RGM naturally adapts to a latent variable that controls the norm of the output. Finally, we instantiate our framework to show tight guarantees for Private Gradient Descent, a problem that naturally fits our relative L2 sensitivity assumption.
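To make the contrast concrete, the following sketch compares the standard Gaussian Mechanism, whose noise is calibrated to a fixed worst-case L2 sensitivity, with the relative variant described in the abstract, where the noise scale grows with the norm of the released output. This is an illustrative toy, not the paper's exact construction: the `sigma_multiplier` placeholder stands in for the constant that would be derived from the target RDP parameters.

```python
import numpy as np

def gaussian_mechanism(query_output, l2_sensitivity, sigma_multiplier=4.0, rng=None):
    """Standard GM: noise scale is proportional to a fixed (worst-case)
    L2 sensitivity bound, independent of the actual output."""
    rng = np.random.default_rng(rng)
    sigma = sigma_multiplier * l2_sensitivity
    return query_output + rng.normal(0.0, sigma, size=query_output.shape)

def relative_gaussian_mechanism(query_output, rel_sensitivity, sigma_multiplier=4.0, rng=None):
    """Illustrative RGM sketch: under relative L2 sensitivity, the distance
    between neighboring outputs is bounded by rel_sensitivity * ||output||,
    so the noise standard deviation scales with the output's norm."""
    rng = np.random.default_rng(rng)
    sigma = sigma_multiplier * rel_sensitivity * np.linalg.norm(query_output)
    return query_output + rng.normal(0.0, sigma, size=query_output.shape)

# Example: privatizing a gradient step, as in Private Gradient Descent.
grad = np.array([0.3, -0.1, 0.05])
noisy_grad = relative_gaussian_mechanism(grad, rel_sensitivity=0.1, rng=0)
print(noisy_grad.shape)
```

Note how, when the output norm is small (e.g. gradients near an optimum), the relative mechanism injects correspondingly less noise than a worst-case calibration would.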
Main file: 2308.15250.pdf (544.39 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04370596, version 1 (03-01-2024)

Licence

Attribution

Identifiers

Cite

Hadrien Hendrikx, Paul Mangold, Aurélien Bellet. The Relative Gaussian Mechanism and its Application to Private Gradient Descent. 2023. ⟨hal-04370596⟩