Auburn statistician seeks to create new tools to extract information from privatized data with NSF-funded award
Amid constant, increasingly sophisticated cyberattacks, the topic of data privacy is more relevant than ever.
“As we store larger and larger amounts of data, we increase the risk of exposing them to attacks,” said Roberto Molinari, assistant professor in Auburn University’s Department of Mathematics and Statistics.
He is working to obtain accurate conclusions from data while still maintaining its privacy.
In an area of research derived from computer science, Molinari is working with Jordan Awan, assistant professor of statistics at Purdue University, Stéphane Guerrier, assistant professor in statistics and data science at the Geneva School of Economics and Management at the University of Geneva, and Soren Jordan, associate professor in the College of Liberal Arts at Auburn University.
The award, "Simulation-Based Inference for Differential Privacy," from the National Science Foundation's Division of Social and Economic Sciences, provides $450,000 in funding over the next three years.
The overall goal of this research project is to extract more reliable information from data to which random "noise" has been added as a privacy control.
Adding this "noise" is what allows a dataset to satisfy the definition of "differential privacy," and it serves to protect the data. Think of it as an extra layer of security keeping an individual's information safe.
With encryption, once hackers obtain the key, all of the data is exposed exactly as stored, and the exact information is at their fingertips.
With differential privacy, by contrast, the "noise" keeps hackers from identifying individuals in the data: the data remains inconclusive about whether any given person is present in it.
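The noise-addition idea behind differential privacy can be sketched with the classic Laplace mechanism. This is a minimal illustration of the general technique, not the project's actual method; the function names and parameters here are hypothetical:

```python
import math
import random

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via inverse CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_mean(values, lower, upper, epsilon):
    """Release a differentially private mean of bounded values.

    The sensitivity of the mean of n values clipped to [lower, upper]
    is (upper - lower) / n, so Laplace noise with scale
    sensitivity / epsilon yields epsilon-differential privacy.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)
```

A smaller `epsilon` means more noise and stronger privacy: any single individual's presence in `values` changes the output distribution only slightly, so the released statistic is inconclusive about who contributed.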
“This research will develop a strategy and open-source statistical tools to create a safe way to extract the meaningful information from the data while maintaining its privacy,” said Molinari.
As the random error added to the data accumulates, the data becomes less useful unless that added layer of "noise" is properly accounted for.
“If we can create reliable ways to extract information from privatized data, we can reduce the risks of identity leaks and create opportunities for more data to be safely shared among researchers,” he said.
The real-world applications for this project are everywhere around us, just like the data we produce and store each day.
First and foremost, technology companies could use this software to extract data to learn more about what their users are seeking and customize their online platforms to those needs.
For political science, census data is an essential source for determining political representation in the United States and is consistently needed for analysis.
For researchers, more data could be shared among collaborators helping to improve research and increase the effectiveness of results.
Molinari is seeking to create these new tools to help extract information that could lead to ground-breaking programs protecting privacy and keeping data safe.