Conciliating Privacy and Utility in Data Releases via Individual Differential Privacy and Microaggregation
Jordi Soria-Comas(a), David Sánchez(b), Josep Domingo-Ferrer(b), Sergio Martínez(b),(*), Luis Del Vasto-Terrientes(b)
Transactions on Data Privacy 18:1 (2025) 29 - 50
(a) Universitat Oberta de Catalunya, Rambla del Poblenou 156, Barcelona, 08018, Catalonia.
(b) Department of Computer Engineering and Mathematics, CYBERCAT-Center for Cybersecurity Research of Catalonia, Universitat Rovira i Virgili, Av. Paisos Catalans 26, Tarragona, 43007, Catalonia.
e-mail: jordi_sc@uoc.edu; david.sanchez@urv.cat; josep.domingo@urv.cat; sergio.martinezl@urv.cat; luismiguel.delvasto@fundacio.urv.cat
Abstract
ε-Differential privacy (DP) is a well-known privacy model that offers strong privacy guarantees. However, when applied to data releases, DP significantly degrades the analytical utility of the protected outputs. To keep data utility at reasonable levels, practical applications of DP to data releases have resorted to weak privacy parameters (large ε), which dilute the privacy guarantees of DP. In this work, we tackle this issue by using an alternative formulation of the DP guarantees, called ε-individual differential privacy (iDP), which causes less data distortion while providing subjects with the same protection as DP. We enforce iDP in data releases by relying on attribute masking plus a pre-processing step based on data microaggregation. The goal of this step is to reduce the sensitivity to record changes, which determines the amount of noise required to enforce iDP (and DP). Specifically, we propose data microaggregation strategies designed for iDP whose sensitivity is significantly lower than under DP. As a result, we obtain iDP-protected data with significantly better utility than with DP. We report on experiments showing that our approach provides strong privacy (small ε) while yielding protected data that do not significantly degrade the accuracy of secondary data analyses.
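To make the abstract's core idea concrete, the sketch below illustrates (in Python) how microaggregation can reduce sensitivity before noise addition. This is not the paper's algorithm: the function names, the univariate fixed-size clustering, and the per-centroid noise calibration are illustrative assumptions. It shows only the basic intuition that, for a fixed partition into clusters of at least k records bounded in [lower, upper], changing one record shifts its cluster centroid by at most (upper − lower)/k, so far less noise is needed than to protect raw records; handling the fact that the partition itself may change when a record changes is the harder issue the paper's iDP-specific microaggregation strategies address.

```python
import math
import random


def microaggregate(values, k):
    """Univariate microaggregation sketch: sort, partition into clusters
    of at least k records, and replace each record by its cluster mean."""
    values = sorted(values)
    clusters = [values[i:i + k] for i in range(0, len(values), k)]
    if len(clusters) > 1 and len(clusters[-1]) < k:  # merge undersized tail
        clusters[-2].extend(clusters.pop())
    out = []
    for c in clusters:
        mean = sum(c) / len(c)
        out.extend([mean] * len(c))
    return out


def laplace(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def noisy_centroid_release(values, k, epsilon, lower, upper):
    """Release microaggregated values with per-centroid Laplace noise.

    For a fixed partition, one record change inside a cluster of size >= k
    moves the centroid by at most (upper - lower) / k, so the noise scale
    (upper - lower) / (k * epsilon) is much smaller than the
    (upper - lower) / epsilon needed to protect an unaggregated record."""
    agg = microaggregate(values, k)
    scale = (upper - lower) / (k * epsilon)
    noise = {c: laplace(scale) for c in set(agg)}  # one draw per centroid
    return [c + noise[c] for c in agg]
```

For example, `microaggregate([1, 2, 3, 4, 5], 2)` groups the sorted values into clusters `{1, 2}` and `{3, 4, 5}` and returns `[1.5, 1.5, 4.0, 4.0, 4.0]`; records in the same cluster then receive the same noisy centroid.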