Providentia: Privacy-driven Trust in Algorithms

Rasmus Pagh, University of Copenhagen
Grant amount: DKK 8,410,068

How do we ensure that we can trust systems that use data to make decisions? Lawmakers across the world are grappling with how to properly regulate systems that collect, analyze, and use data. Getting the balance right is crucial: too little regulation risks compromising basic societal values and privacy, while overly strict regulation limits our ability to realize the value and societal benefit of AI and big data analytics.

The Providentia project will advance algorithms for integrating and analyzing sensitive data – such as health data or medical records – in a secure way that preserves privacy and does not require all data to be transferred to a central location.

During the last decade, differential privacy has emerged as the gold standard for protecting private information, offering firm mathematical guarantees on how much private information can be deduced from released data sets, analyses, or predictions. Extending recent developments, the project will establish a research group focusing on differentially private algorithms in distributed settings. The goal is to enable data science when no single entity holds all relevant data, and when privacy considerations make it impossible or undesirable to consolidate the data for central analysis. This is relevant, for instance, when using healthcare data to improve health outcomes through better prevention, diagnosis, and treatment of disease.
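Concretely, the guarantee can be stated in its standard textbook form (included here for context; it is not specific to this project): a randomized algorithm M is ε-differentially private if, for every pair of data sets D and D' that differ in a single person's record, and every set S of possible outputs,

    Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D') ∈ S].

The parameter ε bounds how much any one individual's data can influence the released output, and therefore how much an observer can deduce about that individual.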

The project name is inspired by the ancient Roman goddess Providentia, who personifies the ability to foresee events and make suitable provision. In this spirit, the Providentia project seeks to provide the forethought needed to ensure that data scientists can draw on valuable sources of insight, even when the data contains sensitive information.

Project participants
Rasmus Pagh, Professor
University of Copenhagen, Department of Computer Science