
Communications of the ACM

ACM TechNews

A New Way to Look at Data Privacy

In several cases, the researchers show that the amount of noise required to protect sensitive data from adversaries is far smaller with PAC Privacy than with other approaches.

MIT researchers created a new data privacy metric, Probably Approximately Correct (PAC) Privacy.

Credit: Jose-Luis Olivares/MIT, iStock

A new metric developed by Massachusetts Institute of Technology (MIT) researchers allows a small amount of noise to be added to models to protect sensitive data while maintaining the model's accuracy.

An accompanying framework to the Probably Approximately Correct (PAC) Privacy metric automatically identifies the minimal amount of noise to add without having to know the model's inner workings.

PAC Privacy considers how difficult it would be for an adversary to reconstruct sensitive data after noise has been added, and determines the optimal amount of noise based on the entropy of the original data from the adversary's viewpoint.

It runs the user's machine learning training algorithm numerous times on different subsamplings of data, comparing the variance across all outputs to calculate how much noise must be added.
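The black-box idea described above can be illustrated with a short sketch: treat the training algorithm as an opaque function, run it on many random subsamples of the data, and use the spread of its outputs to size the noise. The function name, parameters, and the simple per-coordinate Gaussian calibration below are illustrative assumptions, not the authors' actual PAC Privacy framework.

```python
import numpy as np

def pac_noise_sketch(train, data, n_trials=100, subsample_frac=0.5, seed=0):
    """Illustrative sketch (not the authors' framework): estimate how much
    Gaussian noise to add to a training algorithm's output by measuring the
    variance of that output across random subsamples of the data."""
    rng = np.random.default_rng(seed)
    n = len(data)
    outputs = []
    for _ in range(n_trials):
        # Run the black-box training algorithm on a random subsample.
        idx = rng.choice(n, size=int(subsample_frac * n), replace=False)
        outputs.append(np.asarray(train(data[idx])))
    outputs = np.stack(outputs)
    # Low variance across subsamples suggests the output reveals little
    # about any individual record, so less noise is needed (and vice versa).
    sigma = outputs.std(axis=0)
    noisy = outputs.mean(axis=0) + rng.normal(0.0, sigma)
    return noisy, sigma

# Example: the "training algorithm" is just computing the mean of a dataset.
data = np.linspace(0.0, 1.0, 200)
noisy_mean, sigma = pac_noise_sketch(lambda d: [d.mean()], data)
```

In this toy example the subsample means cluster tightly around 0.5, so the estimated noise scale is small, matching the article's point that low-entropy outputs (from the adversary's viewpoint) need little added noise.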

From MIT News


Abstracts Copyright © 2023 SmithBucklin, Washington, D.C., USA


