Adam O’Neill joined the College of Information and Computer Sciences as an assistant professor in the spring of 2019. His research focuses on cryptography, in particular on secure outsourced databases, which make it possible to share proprietary data without compromising privacy. Say a pharmaceutical company wanted to run a study on DNA sequences without violating participants’ privacy. A secure outsourced database would allow individual DNA sequences to be encrypted before sharing, protecting privacy while still allowing studies to be run on the data. O’Neill explains, “I want people who are using other people’s data to have as little information about the data as possible, so that they can only get the information that the data owners want them to have, and nothing else.”
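One standard building block for such databases can be sketched with a deterministic keyed hash: equal plaintexts produce equal tags, so a server can answer exact-match queries over data it cannot read. This is a simplified illustration, not O’Neill’s actual construction; the key, record IDs, and DNA fragments below are hypothetical.

```python
import hmac
import hashlib

class EncryptedIndex:
    """Toy server-side index: stores only opaque tags, never plaintext."""
    def __init__(self):
        self.tags = {}  # tag -> list of record ids

    def insert(self, tag, record_id):
        self.tags.setdefault(tag, []).append(record_id)

    def lookup(self, tag):
        return self.tags.get(tag, [])

def tag(key: bytes, value: str) -> str:
    # Deterministic keyed hash: equal plaintexts yield equal tags,
    # so the server can match them without learning the value.
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()

# The data owner tags DNA fragments before outsourcing them.
key = b"owner-secret-key"  # hypothetical key, held only by the data owner
server = EncryptedIndex()
server.insert(tag(key, "GATTACA"), record_id=1)
server.insert(tag(key, "CCTGAGG"), record_id=2)

# An authorized analyst, given the key, queries for an exact match;
# the server sees only the opaque tag, never the sequence itself.
matches = server.lookup(tag(key, "GATTACA"))
```

A real system would layer randomized encryption of the records themselves on top of this index; the sketch shows only why equality queries remain possible after encryption.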
In addition, O’Neill and Center for Data Science postdoctoral fellow Mukul Kulkarni are working on applying encryption to federated machine learning. In a federated setting, several machine learning models are trained locally and then sent to a central server, where they are combined into a single aggregated model. Currently, the privacy of the raw data is not protected in federated settings, because the local models can be reverse-engineered from the aggregated model. O’Neill is developing a special type of encryption that would allow participants to encrypt their models before sending them to the central server, so that the server can aggregate them without ever seeing the underlying data. For example, a local real estate company could create a model predicting a property’s market value based on square footage. Such models could be trained locally in many different markets and then sent to a central server to be aggregated into one model, without revealing the raw data from any single market.
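The pairwise-masking idea behind such privacy-preserving aggregation can be sketched in a few lines. This is a minimal illustration of the general technique, not O’Neill’s construction: each pair of parties agrees on random pads, one adds and the other subtracts them, so the pads cancel in the server’s sum and the server learns only the aggregate. Model weights are assumed to be integer-encoded.

```python
import random

def federated_average(local_models, modulus=2**32):
    """Toy secure aggregation: each party masks its model with pairwise
    random pads that cancel in the sum, so the server never sees any
    single party's model, only the aggregate."""
    n = len(local_models)
    dim = len(local_models[0])
    # Pairwise pads: for i < j, party i adds pads[i][j], party j subtracts it.
    pads = [[[random.randrange(modulus) for _ in range(dim)]
             for _ in range(n)] for _ in range(n)]
    masked = []
    for i, model in enumerate(local_models):
        m = list(model)
        for j in range(n):
            if i < j:
                m = [(x + p) % modulus for x, p in zip(m, pads[i][j])]
            elif i > j:
                m = [(x - p) % modulus for x, p in zip(m, pads[j][i])]
        masked.append(m)
    # The server sums the masked models; every pad cancels modulo the modulus.
    total = [sum(col) % modulus for col in zip(*masked)]
    return [t / n for t in total]

# Three hypothetical markets submit integer-encoded price-per-square-foot
# weights; the server recovers only their average.
avg = federated_average([[210], [190], [230]])
```

Because the pads cancel exactly, the result is the same as averaging the plaintext models, yet each individual submission looks uniformly random to the server.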
Both secure outsourced databases and encrypted federated machine learning rely on a new technology introduced by O’Neill through his research: function-revealing encryption. Function-revealing encryption allows specified functions to be computed over encrypted data; for example, it could enable tracking a disease’s spread globally while keeping the underlying medical records encrypted and safe.
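The interface of such a scheme can be illustrated for the simplest revealed function, comparison: anyone holding two ciphertexts can learn their order and nothing else. The class below is a toy stand-in whose secret lookup table replaces the real cryptography; it shows only the shape of the API, not an actual secure construction.

```python
import hmac
import hashlib
import os

class ToyOrderRevealing:
    """Illustrative interface for function-revealing encryption where the
    revealed function is comparison. In a real scheme, compare() would work
    publicly from the ciphertexts alone; here a secret table stands in for
    the cryptography, so this toy is NOT secure."""
    def __init__(self):
        self.key = os.urandom(16)
        self._table = {}  # ciphertext -> plaintext (stand-in, not real crypto)

    def encrypt(self, value: int) -> str:
        ct = hmac.new(self.key, str(value).encode(), hashlib.sha256).hexdigest()
        self._table[ct] = value
        return ct

    def compare(self, ct1: str, ct2: str) -> int:
        # The revealed function: sign of (x - y), and nothing more about x, y.
        x, y = self._table[ct1], self._table[ct2]
        return (x > y) - (x < y)

# Hypothetical use: compare encrypted case counts from two regions
# without decrypting either count.
scheme = ToyOrderRevealing()
region_a = scheme.encrypt(57)
region_b = scheme.encrypt(94)
result = scheme.compare(region_a, region_b)
```

The point of the interface is what it withholds: an evaluator learns only that one count exceeds the other, never the counts themselves.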
O’Neill is also eager to engage industry partners on their particular data-security needs and on how his research might be applied: “One of the things I would like to do is work with industry to develop provably secure ways to do federated learning so that they can protect their data while still getting use out of it.”