Augmented Differential Privacy Framework for Data Analytics
Abstract
Differential privacy has emerged as a popular framework for providing privacy-preserving noisy query answers based on the statistical properties of databases. It guarantees that the distribution of noisy query answers changes very little with the addition or deletion of any single tuple. Differential privacy has earned a strong reputation for providing privacy without making any assumptions about the data and for protecting against attackers who know all but one record. It is, however, a relatively new field of research: most users have limited experience in managing differential privacy parameters and in achieving a suitable level of privacy without degrading the quality of the analysis, and the vast majority are still learning how to apply differential privacy effectively in practice.
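As a minimal, self-contained sketch of this guarantee (not drawn from the paper itself), the Laplace mechanism answers a numeric query by adding noise whose scale is the query's sensitivity divided by the privacy parameter epsilon; a counting query has sensitivity 1, so adding or removing one record shifts the distribution of noisy answers only slightly. The function name laplace_count and the sample data below are hypothetical.

import numpy as np

def laplace_count(data, predicate, epsilon):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one record
    changes the true count by at most 1), so noise is drawn from
    Laplace(scale = 1 / epsilon); a smaller epsilon means more noise
    and a stronger privacy guarantee.
    """
    true_count = sum(1 for row in data if predicate(row))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: count records with age > 40 under epsilon = 0.5 (illustrative data only).
records = [{"age": 34}, {"age": 51}, {"age": 47}, {"age": 29}]
print(laplace_count(records, lambda r: r["age"] > 40, epsilon=0.5))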
In this paper, we discuss the proposed augmented framework, which produces differentially private data for any given query, the differential privacy techniques it supports, metrics for the privacy-utility tradeoff of the data, and the efficacy of the framework. We also review the state of the art of the differential privacy techniques defined in the framework: the Laplace, bounded Laplace, randomized response, and exponential mechanisms for different data types. The augmented framework consists of three parts: privacy parameter inputs that allow interactive and iterative control over querying the data; the differential privacy techniques themselves; and metrics that measure the privacy and utility thresholds, allowing the data analyst to evaluate the accuracy of the privacy-safe data and to select privacy-guaranteed data within the given privacy budget. The framework takes any dataset as input and generates another dataset that is structurally and statistically very similar to the original. The newly generated dataset carries a much stronger privacy guarantee on the selected sensitive and non-sensitive data types. We also demonstrate analytical models developed using the privacy-safe data from the framework as substitutes for models developed on the original datasets, and we demonstrate the framework and analytical models on sample datasets to show the similarity between the original and differentially private datasets.
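As an illustrative sketch only, under the assumption that categorical attributes are perturbed with k-ary randomized response and that utility is checked by comparing value distributions before and after perturbation (the paper does not publish its implementation; the names randomized_response and distribution_distance are hypothetical):

import math
import random
from collections import Counter

def randomized_response(values, domain, epsilon):
    """Perturb a categorical column with k-ary randomized response.

    Each true value is kept with probability p = e^eps / (e^eps + k - 1)
    and otherwise replaced by a uniformly random other value in the
    domain, which satisfies epsilon-local differential privacy.
    """
    k = len(domain)
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    return [v if random.random() < p_keep
            else random.choice([d for d in domain if d != v])
            for v in values]

def distribution_distance(original, private):
    """Total variation distance between value frequencies: a crude utility metric."""
    n = len(original)
    f1, f2 = Counter(original), Counter(private)
    return 0.5 * sum(abs(f1[k] / n - f2[k] / n) for k in set(f1) | set(f2))

# Example: perturb a sensitive column and measure how far its distribution
# drifted from the original (illustrative data only).
column = ["A", "B", "A", "C", "B", "A"]
private = randomized_response(column, domain=["A", "B", "C"], epsilon=1.0)
print(private, distribution_distance(column, private))

A larger epsilon keeps more true values (higher utility, weaker privacy); in a framework of the kind described above, the analyst would iterate on epsilon until the distance metric falls within the utility threshold while staying inside the privacy budget.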


