
Conference: papers published in international conference proceedings

The ubiquity of sensor-equipped mobile devices has enabled citizens to contribute data via participatory sensing systems. This emerging paradigm supports a variety of applications that improve users' quality of life. However, the data collection process may compromise the participants' privacy when the reported data are tagged with, or correlated to, their sensitive information.
Anonymization and location cloaking techniques have therefore been designed to provide privacy protection, albeit at some cost to data utility, which is a major concern for queriers. Unlike past works, we simultaneously address the two competing goals of ensuring the queriers' required data utility and protecting the participants' privacy. First, we introduce a trustworthy entity into the traditional participatory sensing system. We then propose a general privacy-preserving mechanism that runs on this entity and releases a distorted version of the sensed data so as to minimize the leakage of the associated private information.
We demonstrate how to identify a near-optimal solution to the
privacy-utility tradeoff by maximizing a privacy score while considering a utility metric set by data queriers (service providers).
Furthermore, we tackle the challenge of large-alphabet data by investigating quantization techniques. Finally, we
evaluate the proposed model on three different real datasets
while varying the prior knowledge and the obfuscation type.
The results show that, across different applications, a limited amount of distortion can ensure the participants' privacy while preserving about 98% of the required data utility.
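The privacy-utility tradeoff described above can be illustrated with a toy sketch (this is not the paper's mechanism; the binary alphabet, the randomized-response obfuscation, and all parameter values are illustrative assumptions): a released value is distorted with some probability, utility is measured as agreement with the true sensed data, and leakage is measured as the empirical mutual information between the release and a correlated private attribute.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a binary private attribute X and a sensed
# value S that is 80% correlated with it.
n = 100_000
x = rng.integers(0, 2, n)                       # private attribute
s = np.where(rng.random(n) < 0.8, x, 1 - x)     # sensed data

def randomized_response(values, p_keep):
    """Release each binary value truthfully with prob p_keep, else flip it."""
    keep = rng.random(values.size) < p_keep
    return np.where(keep, values, 1 - values)

def mutual_information(a, b):
    """Empirical mutual information I(a;b) in bits for binary arrays."""
    mi = 0.0
    for va in (0, 1):
        for vb in (0, 1):
            p_ab = np.mean((a == va) & (b == vb))
            p_a, p_b = np.mean(a == va), np.mean(b == vb)
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi

# Sweep the distortion level: more distortion lowers leakage about the
# private attribute, but also lowers utility for the querier.
for p_keep in (1.0, 0.9, 0.75):
    y = randomized_response(s, p_keep)
    utility = np.mean(y == s)
    leakage = mutual_information(x, y)
    print(f"p_keep={p_keep:.2f}  utility={utility:.2f}  leakage={leakage:.3f} bits")
```

Sweeping `p_keep` traces a tradeoff curve; picking the largest distortion that still meets a querier's utility threshold is the spirit of the near-optimal selection the abstract refers to.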