AI GLOSSARY

Differential Privacy

Privacy & Data Governance

A mathematical framework, formalized by Cynthia Dwork and colleagues in 2006, for adding carefully calibrated random noise to data or model outputs in a way that protects individual privacy while preserving the statistical usefulness of the overall dataset. Differential privacy provides a formal, quantifiable guarantee, governed by a privacy parameter ε (epsilon): the distribution of outputs changes by at most a small, bounded factor when any single individual's data is added to or removed from the dataset, so an observer cannot meaningfully infer that individual's presence or absence from the output.
See also: de-identification, data minimization, Privacy.
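The most common way to achieve this guarantee for numeric queries is the Laplace mechanism: add noise drawn from a Laplace distribution scaled to the query's sensitivity divided by ε. Below is a minimal sketch in Python; the function names (`laplace_noise`, `private_count`) are illustrative, not from any particular library.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via inverse CDF:
    # draw u uniform in (-0.5, 0.5), then invert.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(values, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing
    # one individual changes the true count by at most 1.
    true_count = sum(1 for v in values if predicate(v))
    scale = 1.0 / epsilon  # noise scale = sensitivity / epsilon
    return true_count + laplace_noise(scale)
```

Smaller ε means stronger privacy but noisier answers; larger ε means the noisy count tracks the true count closely, at the cost of a weaker privacy guarantee.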