
Communications of the ACM

ACM TechNews

Can Foundation Models Help Us Achieve 'Perfect Secrecy'?


If we expect digital assistants to facilitate personal tasks that involve a mix of public and private data, we’ll need the technology to provide “perfect secrecy,” or the highest possible level of privacy, in certain situations.


Stanford University's Simran Arora and Christopher Ré are exploring whether emerging foundation models can help to realize the highest level of privacy, or "perfect secrecy."

Arora said perfect secrecy ensures that, as users interact with the system, the likelihood of adversaries learning private information does not increase; nor does the probability of accidental data exposure rise when multiple personal tasks are completed using the same data.

Arora and Ré devised the Foundation Model Controls for User Secrecy (FOCUS) framework, which uses a one-way data flow to complete personal tasks without sacrificing privacy.

FOCUS accommodates personal data privacy and also hides the task the model was asked to complete, as well as how it was executed.
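The abstract does not spell out the mechanism behind the one-way data flow, but the idea can be illustrated with a minimal toy sketch in Python. All names here (`PrivateScope`, `run_task`, `toy_model`) are hypothetical stand-ins, not part of FOCUS itself: public resources such as a model and a prompt may enter the user's private scope, while nothing derived from private data ever flows back out.

```python
# Hypothetical illustration of a one-way data flow: public inputs enter
# the private scope, but no result derived from private data leaves it.

class PrivateScope:
    """Holds the user's data; task results are retained locally only."""
    def __init__(self, personal_data):
        self._data = personal_data
        self._outputs = []  # never exported to the public side

    def run_task(self, public_model, prompt):
        # Inference runs locally inside the private scope; the caller on
        # the public side receives nothing back.
        result = public_model(prompt, self._data)
        self._outputs.append(result)
        return None  # the one-way boundary: nothing crosses back

def toy_model(prompt, data):
    # Stand-in for a locally hosted foundation model.
    return f"{prompt}: {len(data)} records processed"

scope = PrivateScope(personal_data=["email_1", "email_2"])
leaked = scope.run_task(toy_model, "Summarize my inbox")
assert leaked is None  # the public caller learns nothing
```

The point of the sketch is the return value of `run_task`: by construction, no information about the private data, the task, or its result escapes the scope, which mirrors the privacy guarantee the framework aims for.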

The framework also rivals federated learning on six of seven standard benchmarks, and improves efficiency by relying on inference rather than training.

From Stanford University Institute for Human-Centered Artificial Intelligence


Abstracts Copyright © 2022 SmithBucklin, Washington, DC, USA

