
Table 3 Advantages and drawbacks of DP mechanisms

From: Differential privacy: its technological prescriptive using big data

PINQ [18]

Advantages:
- First platform providing differential privacy guarantees
- Expands the set of users able to work with sensitive data, increases the portability of privacy-preserving algorithms across data sets and domains, and broadens the scope of analysis of sensitive data

Drawbacks:
- Does not consider the application developer to be an adversary
- Subject to a weaker privacy constraint, and hence vulnerable to state attacks, privacy-budget attacks, and timing attacks
- Requires developers to rewrite applications to use the PINQ primitives
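The privacy-budget accounting that PINQ pioneered (and that the state and budget attacks above exploit) can be illustrated with a minimal sketch: a wrapper tracks a global epsilon and refuses queries once it is spent. The class name and interface here are invented for illustration; PINQ's actual API is a C#/LINQ platform and is considerably richer.

```python
import random

class BudgetedQueryable:
    """Toy privacy-budget accountant in the spirit of PINQ's queryable
    wrappers (illustrative only; not PINQ's real interface).

    Each noisy query consumes part of a global epsilon budget; once the
    budget is exhausted, further queries are refused.
    """

    def __init__(self, data, total_epsilon):
        self.data = data
        self.remaining = total_epsilon

    def noisy_count(self, epsilon):
        if epsilon <= 0 or epsilon > self.remaining + 1e-12:
            raise ValueError("invalid epsilon or privacy budget exhausted")
        self.remaining -= epsilon
        # A count has sensitivity 1, so the Laplace mechanism uses noise of
        # scale 1/epsilon; the difference of two exponentials with rate
        # epsilon has exactly that Laplace distribution.
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return len(self.data) + noise
```

Note that the accountant itself is state an adversarial analyst could probe, which is the essence of the state and privacy-budget attacks listed above.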

Airavat [22]

Advantages:
- First system to integrate mandatory access control with differential privacy, enabling many privacy-preserving MapReduce computations without the need to audit untrusted code
- Can be deployed at large scale without rewriting existing MapReduce applications

Drawbacks:
- Cannot confine every computation performed by untrusted code: only the map program is treated as "untrusted", while the reduce program is "trusted" to be implemented in a differentially private manner
- Supports only a limited set of reducer functions
- Vulnerable to state attacks and timing attacks
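Airavat's division of trust (an untrusted mapper whose outputs are clamped to a declared range, and a trusted reducer that adds noise calibrated to that range) can be sketched as follows. The function name and clamping policy are illustrative assumptions, not Airavat's real interface.

```python
import random

def dp_mapreduce(records, mapper, epsilon, lo=0.0, hi=1.0):
    """Sketch of an Airavat-style pipeline (illustrative, not the real API):
    the mapper is untrusted, so each of its outputs is clamped to the
    declared range [lo, hi]; the trusted reducer (here, a sum) then adds
    Laplace noise calibrated to that range.
    """
    # Clamping bounds the influence of any single record, so the sum's
    # sensitivity is at most (hi - lo) no matter what the mapper does.
    clamped = [min(hi, max(lo, mapper(r))) for r in records]
    total = sum(clamped)
    scale = (hi - lo) / epsilon
    # Laplace(scale) noise as the difference of two exponentials.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return total + noise
```

Even a malicious mapper that returns huge values cannot break the guarantee, because its outputs are clamped before the trusted reducer sees them.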

GUPT [26]

Advantages:
- Uses the aging model of data sensitivity to let analysts describe the abstract "privacy budget" in terms of the expected accuracy of the final output
- Automatically allocates a privacy budget to each query to match the data analysts' accuracy requirements
- Defends against side-channel attacks such as privacy-budget attacks, state attacks, and timing attacks

Drawbacks:
- Assumes that the output dimensions are known in advance, which may not always hold in practice
- Inherits the limitations of differential privacy regarding splitting of the privacy budget
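GUPT's idea of turning an accuracy target into a privacy budget can be illustrated with the standard Laplace tail bound; the formula below is a textbook back-of-envelope calculation under an assumed per-query sensitivity, not GUPT's actual allocation procedure.

```python
import math

def epsilon_for_accuracy(alpha, beta, sensitivity=1.0):
    """Translate an accuracy target into a privacy budget, in the spirit of
    GUPT's accuracy-driven allocation (illustrative, not GUPT's exact method).

    For the Laplace mechanism with noise scale b = sensitivity / epsilon,
    P(|noise| > alpha) = exp(-alpha / b).  Requiring that failure
    probability to be at most beta gives
        epsilon = sensitivity * ln(1 / beta) / alpha.
    """
    return sensitivity * math.log(1.0 / beta) / alpha
```

As expected, a tighter accuracy requirement (smaller alpha) or higher confidence (smaller beta) demands a larger epsilon, i.e., more of the privacy budget.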

Geo-indistinguishability [25]

Advantages:
- Proposes a generalized notion of differential privacy, instantiated with the Euclidean metric, that applies naturally to location privacy
- Offers the best privacy guarantees for the same utility among all mechanisms that do not depend on the adversary's prior knowledge, i.e., the mechanism is designed once and for all and remains applicable when the prior is unknown

Drawbacks:
- Linear degradation of the user's privacy limits use of the mechanism over time
- The noise level of the Laplacian mechanism must be fixed in advance, independently of the user's movements
- Despite its flexible behavior, the tiled mechanism does not satisfy geo-indistinguishability to its full potential
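The Laplacian mechanism referred to above is, in the geo-indistinguishability setting, a planar Laplace distribution whose density falls off as exp(-epsilon * distance). A minimal sketch follows; it omits the discretization to a finite grid used in practical deployments, and the function name is our own.

```python
import math
import random

def planar_laplace(x, y, epsilon):
    """Perturb a 2-D location with planar Laplace noise (density
    proportional to exp(-epsilon * r) at distance r), the mechanism used
    for geo-indistinguishability.  The radius of such a point follows a
    Gamma(2, 1/epsilon) distribution, sampled here as the sum of two
    exponentials; the angle is uniform.  Grid discretization is omitted.
    """
    theta = random.uniform(0.0, 2.0 * math.pi)
    r = random.expovariate(epsilon) + random.expovariate(epsilon)
    return x + r * math.cos(theta), y + r * math.sin(theta)
```

The expected displacement is 2/epsilon, which makes concrete why the noise level has to be fixed in advance: a single epsilon determines the accuracy/privacy trade-off for every reported location.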

Telco big data [12]

Advantages:
- First attempt to implement three basic DP architectures in a deployed telecommunication (telco) big data platform for data-mining applications
- Built on the observation that accuracy loss increases with the variety of features but decreases with the volume of training data

Drawbacks:
- Protects the privacy of people in the training data, but not of people in the prediction data (that is, the data to which the trained model will be applied)
- Adjustable privacy-budget assignment strategies still need to be designed for better accuracy alongside the privacy guarantee

e-Health data release [21]

Advantages:
- Improves on previous work by designing a new private partition algorithm for histograms and proposing a heuristic hierarchical query method
- Real experiments were conducted, and the schemes were compared with existing ones to show that the proposal is more efficient in data processing and updating
- Increases the accuracy of data release through consistency, with a privacy proof showing that the proposed algorithm satisfies differential privacy

Drawbacks:
- Data-release issues under differential privacy, such as real-time monitoring and publishing of e-health data, remain to be addressed
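The core primitive behind private histogram release can be sketched in a few lines: count values per bin, then add independent Laplace noise to each count. This is only the baseline mechanism; the paper's partition algorithm and hierarchical-query refinements are omitted, and the function name and bin format are our own.

```python
import random

def dp_histogram(values, bins, epsilon):
    """Minimal differentially private histogram release (baseline only,
    without the partitioning and hierarchical refinements of the e-health
    scheme): count values per half-open bin [lo, hi), then add independent
    Laplace(1/epsilon) noise to each count.  Each record falls in at most
    one bin, so the histogram's sensitivity is 1 regardless of bin count.
    """
    counts = [0] * len(bins)
    for v in values:
        for i, (lo, hi) in enumerate(bins):
            if lo <= v < hi:
                counts[i] += 1
                break
    # Laplace(1/epsilon) noise as the difference of two exponentials.
    return [c + random.expovariate(epsilon) - random.expovariate(epsilon)
            for c in counts]
```

Because every bin is noised independently, repeated or streaming releases of the same histogram consume budget on each release, which is exactly why real-time monitoring is flagged above as an open problem.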