The Next Frontier in Data Privacy


In a digital age where data is produced and collected by the second, Cummings searches for a place for privacy

Contact

Georgia Parmelee

College of Engineering

Summaries

Summary Sentence:

Rachel Cummings, an assistant professor at Georgia Tech’s Stewart School of Industrial and Systems Engineering, is working to better understand data privacy and how it relates to both human behavior and the economy.

Full Summary:

Rachel Cummings, an assistant ISyE professor, is working to better understand data privacy and how it relates to both human behavior and the economy.

Media
  • Rachel Cummings
    (image/jpeg)

Last month, Facebook and Cambridge Analytica brought data privacy into the spotlight. The UK-based data firm acquired millions of Facebook users’ personal data to build software that could target swing voters during political campaigns. Essentially, Facebook data was leveraged to create targeted ads for political gain, raising questions about the legality and ethics of data privacy.

Data privacy can be a grey area for the thousands of companies that use online behavioral data to target consumers every day – from ads on the websites we visit to the coupons we get at the grocery store. These choices are tracked, collected, and analyzed en masse. Sometimes this helps consumers: you watched a certain movie on Netflix, so it suggests another you might like in the same genre. But it can also be intrusive, creating a “Big Brother” feeling: your running route was recorded by a workout app and shared with others.

Rachel Cummings, an assistant professor at Georgia Tech’s H. Milton Stewart School of Industrial and Systems Engineering, is working to better understand data privacy and how it relates to both human behavior and the economy.

“The issue with Facebook and Cambridge Analytica highlights that data privacy is a highly nuanced issue,” said Cummings. “Unlike traditional data breaches, these two companies were legally sharing data according to an agreed-upon contract. The issue in this case is downstream data use: once a person shares their data, who is allowed to use it and for what purposes?”

Cummings recognizes the challenge companies face: they stand to gain valuable insights from their data but hesitate to use it for fear of bad press. Differential privacy – Cummings’ area of focus – offers a way to capitalize on data without intruding on individuals.

Cummings works within the field of differential privacy – a type of database privacy that guarantees the input data from a single individual (your home address, for example) has a very small impact on the output of a computation (Zillow reporting how many people live in a neighborhood, for instance). The goal of differential privacy is to ensure you only learn from the global database aggregate, rather than any specific individual.
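The textbook way to achieve this guarantee for a numeric query is the Laplace mechanism: add noise calibrated to how much one person can change the answer. The sketch below is purely illustrative (the `residents` roster, `private_count`, and the choice of epsilon are made up for the example, not drawn from Cummings’ work):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one
    # person changes the true count by at most 1, so Laplace noise
    # with scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical neighborhood roster: the released count stays close
# to the truth, yet hides whether any single resident is present.
residents = ["alice", "bob", "carol", "dave"]
noisy = private_count(residents, lambda r: True, epsilon=1.0)
```

Smaller values of epsilon mean more noise and stronger privacy; larger values mean a more accurate count. That tuning knob is exactly the privacy-versus-utility trade-off the field studies.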

Cummings’ lab develops and optimizes the algorithms that support differential privacy for corporations like Google, where data can be turned into dollars. Google uses it when the Chrome web browser crashes: to identify the problem without exposing users’ search histories, a differentially private algorithm strips out personally identifiable information. The design protects the privacy of individuals while still giving Google the information it needs to improve the browser.
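Google’s deployed system for Chrome (RAPPOR) is considerably more elaborate, but its core building block is the classic randomized-response technique: each user’s report is randomly perturbed so any single answer is plausibly deniable, yet the aggregate rate can still be estimated. A minimal sketch, using simulated crash data rather than anything from Chrome:

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    # With probability p_truth report the true bit; otherwise
    # answer with a fair coin flip. No single report reveals
    # whether this user really experienced a crash.
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    # Invert the known noise: E[observed] = p*rate + (1-p)*0.5.
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p_truth) * 0.5) / p_truth

# Simulate 10,000 users, roughly 30% of whom had a crash; the
# aggregate estimate recovers that rate from the noisy reports.
random.seed(42)
reports = [randomized_response(random.random() < 0.3) for _ in range(10_000)]
```

The server never learns any individual’s true answer, only the noisy reports, yet the population-level crash rate comes through clearly once enough users contribute.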

Other Fortune 500 companies, like Apple, are leading the way in piloting differential privacy to make better business decisions based on their data. Companies can better understand customers’ preferences, explain why customers made the choices they did, and predict their future behavior. The algorithms can also be applied to healthcare and medical records to find patterns in diseases or discern treatments that work on specific demographics, without violating medical privacy laws.

“With great data comes the potential for great privacy violations,” said Cummings. “As companies make more efficient use of personal data, they must also respect the privacy needs of the individuals who shared their data.  I’m hoping to revolutionize my field, as well as U.S. business practices, by redesigning privacy policies so individuals have some say over how companies use the data they create.”

Cummings plans to continue her work at the intersection of economics, machine learning and data privacy. She proposes that companies need to think about how to incentivize people to share their data, while still giving them privacy guarantees. By striking the right balance between protecting consumer privacy and monetizing data, companies will be able to leverage differential privacy to their advantage.

Additional Information

Groups

School of Industrial and Systems Engineering (ISYE)

Status
  • Created By: Shelley Wunder-Smith
  • Workflow Status: Published
  • Created On: Apr 17, 2018 - 9:46am
  • Last Updated: Apr 17, 2018 - 12:12pm