The experts below are selected from a list of 726,159 experts worldwide, ranked by the ideXlab platform.
Kamalika Chaudhuri - One of the best experts on this subject based on the ideXlab platform.
-
SIGMOD Conference - Pufferfish Privacy Mechanisms for Correlated Data
Proceedings of the 2017 ACM International Conference on Management of Data, 2017. Co-authors: Shuang Song, Yizhen Wang, Kamalika Chaudhuri. Abstract: Many modern databases include personal and sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish is a lack of suitable mechanisms. We provide the first mechanism -- the Wasserstein Mechanism -- which applies to any general Pufferfish framework. Since this mechanism may be computationally inefficient, we provide an additional mechanism that applies to some practical cases such as physical activity measurements across time, and is computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in two separate domains.
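The abstract's core idea -- calibrate noise to how far a secret can shift the query's distribution, measured in Wasserstein distance -- can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names are invented, the distributions are represented by equal-size empirical samples (for which the ∞-Wasserstein distance reduces to the largest gap between matched order statistics), and the paper's full mechanism ranges over all secret pairs and priors in the Pufferfish framework.

```python
import numpy as np

def w_inf(samples_a, samples_b):
    """∞-Wasserstein distance between two 1-D empirical distributions
    given as equal-size, equally weighted samples: after sorting, it is
    the largest absolute gap between matched order statistics."""
    a = np.sort(np.asarray(samples_a, dtype=float))
    b = np.sort(np.asarray(samples_b, dtype=float))
    return float(np.max(np.abs(a - b)))

def wasserstein_mechanism(true_answer, pairwise_samples, epsilon, rng=None):
    """Release F(X) + Lap(W / epsilon), where W is the largest
    ∞-Wasserstein distance between the query's distribution conditioned
    on either secret in any secret pair. `pairwise_samples` holds, for
    each secret pair, samples of F(X) under each conditioning
    (a hypothetical interface chosen for this sketch)."""
    rng = rng or np.random.default_rng()
    w = max(w_inf(a, b) for a, b in pairwise_samples)
    return true_answer + rng.laplace(scale=w / epsilon)
```

Here larger distances between conditional distributions force proportionally more Laplace noise, which is how the mechanism generalizes the sensitivity-based calibration of ordinary differential privacy.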
-
Privacy-preserving Analysis of Correlated Data.
arXiv: Learning, 2016. Co-authors: Yizhen Wang, Shuang Song, Kamalika Chaudhuri. Abstract: Many modern machine learning applications involve sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of a single user across time. However, the current standard of privacy in machine learning, differential privacy, cannot adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish to correlated data problems is the lack of suitable mechanisms. In this paper, we provide a general mechanism, called the Wasserstein Mechanism, which applies to any Pufferfish framework. Since the Wasserstein Mechanism may be computationally inefficient, we provide an additional mechanism, called the Markov Quilt Mechanism, that applies to some practical cases such as physical activity measurements across time, and is computationally efficient.
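The Markov Quilt Mechanism mentioned above exploits the structure of time-series data: in a Markov chain, a hidden value strongly influences only nearby measurements, so noise need not scale with the full record length. The toy sketch below illustrates only that local/remote split for a counting query; the paper's actual calibration also bounds the residual influence on remote nodes, which this simplification ignores.

```python
import numpy as np

def markov_quilt_counting_query(data, epsilon, patch_radius, rng=None):
    """Toy sketch: release a count over a time series X_1..X_T that forms
    a Markov chain. Nodes within `patch_radius` of a secret X_i are
    treated as fully correlated with it (a 'quilt patch' of size
    2*patch_radius + 1) and the rest as approximately independent, so
    the count's sensitivity to one hidden value is bounded by the patch
    size. An illustration of the idea only -- not the paper's exact
    noise calibration, which also accounts for remote influence."""
    rng = rng or np.random.default_rng()
    patch_size = 2 * patch_radius + 1
    true_count = float(np.sum(data))
    return true_count + rng.laplace(scale=patch_size / epsilon)
```

The computational efficiency claimed in the abstract comes from this locality: the noise scale depends on a small patch rather than on every correlated record.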
-
Pufferfish Privacy Mechanisms for Correlated Data
arXiv: Learning, 2016. Co-authors: Shuang Song, Yizhen Wang, Kamalika Chaudhuri. Abstract: Many modern databases include personal and sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish is a lack of suitable mechanisms. We provide the first mechanism -- the Wasserstein Mechanism -- which applies to any general Pufferfish framework. Since this mechanism may be computationally inefficient, we provide an additional mechanism that applies to some practical cases such as physical activity measurements across time, and is computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in two separate domains.
Shuang Song - One of the best experts on this subject based on the ideXlab platform.
-
SIGMOD Conference - Pufferfish Privacy Mechanisms for Correlated Data
Proceedings of the 2017 ACM International Conference on Management of Data, 2017. Co-authors: Shuang Song, Yizhen Wang, Kamalika Chaudhuri. Abstract: Many modern databases include personal and sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish is a lack of suitable mechanisms. We provide the first mechanism -- the Wasserstein Mechanism -- which applies to any general Pufferfish framework. Since this mechanism may be computationally inefficient, we provide an additional mechanism that applies to some practical cases such as physical activity measurements across time, and is computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in two separate domains.
-
Privacy-preserving Analysis of Correlated Data.
arXiv: Learning, 2016. Co-authors: Yizhen Wang, Shuang Song, Kamalika Chaudhuri. Abstract: Many modern machine learning applications involve sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of a single user across time. However, the current standard of privacy in machine learning, differential privacy, cannot adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish to correlated data problems is the lack of suitable mechanisms. In this paper, we provide a general mechanism, called the Wasserstein Mechanism, which applies to any Pufferfish framework. Since the Wasserstein Mechanism may be computationally inefficient, we provide an additional mechanism, called the Markov Quilt Mechanism, that applies to some practical cases such as physical activity measurements across time, and is computationally efficient.
-
Pufferfish Privacy Mechanisms for Correlated Data
arXiv: Learning, 2016. Co-authors: Shuang Song, Yizhen Wang, Kamalika Chaudhuri. Abstract: Many modern databases include personal and sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish is a lack of suitable mechanisms. We provide the first mechanism -- the Wasserstein Mechanism -- which applies to any general Pufferfish framework. Since this mechanism may be computationally inefficient, we provide an additional mechanism that applies to some practical cases such as physical activity measurements across time, and is computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in two separate domains.
Yizhen Wang - One of the best experts on this subject based on the ideXlab platform.
-
SIGMOD Conference - Pufferfish Privacy Mechanisms for Correlated Data
Proceedings of the 2017 ACM International Conference on Management of Data, 2017. Co-authors: Shuang Song, Yizhen Wang, Kamalika Chaudhuri. Abstract: Many modern databases include personal and sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish is a lack of suitable mechanisms. We provide the first mechanism -- the Wasserstein Mechanism -- which applies to any general Pufferfish framework. Since this mechanism may be computationally inefficient, we provide an additional mechanism that applies to some practical cases such as physical activity measurements across time, and is computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in two separate domains.
-
Privacy-preserving Analysis of Correlated Data.
arXiv: Learning, 2016. Co-authors: Yizhen Wang, Shuang Song, Kamalika Chaudhuri. Abstract: Many modern machine learning applications involve sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of a single user across time. However, the current standard of privacy in machine learning, differential privacy, cannot adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish to correlated data problems is the lack of suitable mechanisms. In this paper, we provide a general mechanism, called the Wasserstein Mechanism, which applies to any Pufferfish framework. Since the Wasserstein Mechanism may be computationally inefficient, we provide an additional mechanism, called the Markov Quilt Mechanism, that applies to some practical cases such as physical activity measurements across time, and is computationally efficient.
-
Pufferfish Privacy Mechanisms for Correlated Data
arXiv: Learning, 2016. Co-authors: Shuang Song, Yizhen Wang, Kamalika Chaudhuri. Abstract: Many modern databases include personal and sensitive correlated data, such as private information on users connected together in a social network, and measurements of physical activity of single subjects across time. However, differential privacy, the current gold standard in data privacy, does not adequately address privacy issues in this kind of data. This work looks at a recent generalization of differential privacy, called Pufferfish, that can be used to address privacy in correlated data. The main challenge in applying Pufferfish is a lack of suitable mechanisms. We provide the first mechanism -- the Wasserstein Mechanism -- which applies to any general Pufferfish framework. Since this mechanism may be computationally inefficient, we provide an additional mechanism that applies to some practical cases such as physical activity measurements across time, and is computationally efficient. Our experimental evaluations indicate that this mechanism provides privacy and utility for synthetic as well as real data in two separate domains.
Min Wang - One of the best experts on this subject based on the ideXlab platform.
-
VTC Fall - Joint Opportunistic Network Coding and Opportunistic Routing for Correlated Data Gathering in Wireless Sensor Network
2013 IEEE 78th Vehicular Technology Conference (VTC Fall), 2013. Co-authors: Chong Tan, Junni Zou, Min Wang. Abstract: In this paper, we study the problem of correlated data gathering in wireless sensor networks. An opportunistic routing protocol combined with opportunistic network coding is proposed for correlated data gathering. In order to reduce the total number of transmissions, we introduce the expected number of coded transmissions (ECTX) as the metric for route selection. Moreover, we also define a coding gain for network nodes to count coding opportunities, with which the throughput of the wireless sensor network is further increased. Numerical simulations are conducted to analyze the performance of the proposed routing scheme.
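A metric like ECTX builds on the classic expected transmission count (ETX) for a lossy link, 1 / (p_f * p_r), where p_f and p_r are the forward and reverse delivery probabilities. The sketch below shows that base metric plus a hypothetical coding-aware adjustment that discounts a path's cost by an estimated coding gain; the paper's precise ECTX definition is not reproduced here and may differ.

```python
def etx(p_forward, p_reverse):
    """Classic expected transmission count for one link: the expected
    number of transmissions (including retransmissions) needed for a
    successful delivery plus acknowledgment, 1 / (p_f * p_r)."""
    return 1.0 / (p_forward * p_reverse)

def coded_path_metric(link_probs, coding_gain):
    """Hypothetical coding-aware path metric in the spirit of ECTX:
    the sum of per-link ETX values discounted by an estimated coding
    gain (roughly, how many native packets each coded transmission
    carries). Illustrative only, not the paper's exact formula."""
    raw = sum(etx(pf, pr) for pf, pr in link_probs)
    return raw / coding_gain
```

Under such a metric, a route offering more coding opportunities can win the selection even if its raw ETX is slightly higher, which is the trade-off opportunistic coding-aware routing exploits.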
Patrick J. Heagerty - One of the best experts on this subject based on the ideXlab platform.
-
Weighted empirical adaptive variance estimators for Correlated Data regression
Journal of the Royal Statistical Society: Series B (Statistical Methodology), 1999. Co-authors: Thomas Lumley, Patrick J. Heagerty. Abstract: Estimating equations based on marginal generalized linear models are useful for regression modelling of correlated data, but inference and testing require reliable estimates of standard errors. We introduce a class of variance estimators based on the weighted empirical variance of the estimating functions and show that an adaptive choice of weights allows reliable estimation both asymptotically and, by simulation, in finite samples. Connections with previous bootstrap and jackknife methods are explored. The effect of reliable variance estimation is illustrated with data on the health effects of air pollution in King County, Washington.
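The estimator class described above has the familiar sandwich form, with the "meat" built from a weighted empirical variance of the per-cluster estimating functions. The sketch below shows that general shape with user-supplied weights; the paper's contribution, the adaptive choice of those weights, is not reproduced, and the function name and interface are illustrative.

```python
import numpy as np

def weighted_sandwich_variance(U, A, weights=None):
    """Sandwich-form variance estimate for a GEE-style estimator:
    V = A^{-1} B A^{-T}, where A is the 'bread' (sum of estimating-
    function derivatives) and B is the weighted empirical variance of
    the per-cluster estimating functions U_i (rows of U). With unit
    weights this reduces to the usual robust (sandwich) estimator."""
    U = np.asarray(U, dtype=float)
    n, p = U.shape
    w = np.ones(n) if weights is None else np.asarray(weights, dtype=float)
    B = (U * w[:, None]).T @ U          # B = sum_i w_i * U_i U_i^T
    A_inv = np.linalg.inv(A)
    return A_inv @ B @ A_inv.T
```

With few clusters the unit-weight version is known to be biased downward, which is the finite-sample problem an adaptive reweighting of the U_i terms is meant to correct.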