The Performance Limit for Distributed Bayesian Estimation with Identical One-Bit Quantizers
Document Type
Conference Proceeding
Publication Date
2015
DOI
http://dx.doi.org/10.1109/DSP-SPE.2015.7369521
Abstract
In this paper, a performance limit is derived for a distributed Bayesian parameter estimation problem in sensor networks where the prior probability density function of the parameter is known. The sensor observations are assumed conditionally independent and identically distributed given the parameter to be estimated, and the sensors employ independent, identical quantizers. The performance limit is established as the best possible asymptotic performance that a distributed estimation scheme can achieve over all possible sensor observation models. It is obtained by deriving the optimal probabilistic quantizer in the ideal setting, where the sensors observe the parameter directly without any noise or distortion. With a uniform prior, the derived Bayesian performance limit and the associated quantizer coincide with the previously developed performance limit and quantizers under the minimax framework, where the parameter is assumed to be fixed but unknown. The proposed performance limit under the distributed Bayesian setting is compared against a widely used performance bound based on full-precision sensor observations. The comparison shows that the limit derived in this paper is much tighter in most meaningful signal-to-noise ratio (SNR) regions. Moreover, unlike the performance bound based on unquantized observations, which can never be achieved, the proposed limit is achievable under certain observation noise models.
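The following is a minimal numerical sketch, not taken from the paper, of the ideal setting the abstract describes: N sensors observe the parameter directly and each reports a single bit through a probabilistic quantizer. The quantizer form Pr(bit = 1 | theta) = theta with a uniform prior on [0, 1], the posterior-mean fusion rule, and the resulting Bayes risk 1/(6(N+2)) are all assumptions made here for illustration; the optimal quantizer actually derived in the paper is not reproduced in this abstract.

```python
import numpy as np

# Sketch (assumptions, not the paper's construction):
#   - theta is uniform on [0, 1] (known prior)
#   - each of N sensors sees theta noiselessly (the "ideal" setting)
#   - each sensor sends bit = 1 with probability theta (hypothetical
#     probabilistic one-bit quantizer)
#   - the fusion center uses the Bayesian posterior-mean estimate
rng = np.random.default_rng(0)

def simulate_mse(num_sensors, num_trials=100_000):
    """Empirical MSE of the posterior-mean estimate from one-bit reports."""
    theta = rng.uniform(0.0, 1.0, size=num_trials)      # draw from the prior
    ones = rng.binomial(num_sensors, theta)              # count of 1-bits received
    # With a uniform prior and Pr(bit=1|theta) = theta, the posterior is
    # Beta(ones + 1, N - ones + 1), so the posterior mean is (ones + 1)/(N + 2).
    theta_hat = (ones + 1.0) / (num_sensors + 2.0)
    return np.mean((theta_hat - theta) ** 2)

for n in (10, 100, 1000):
    print(f"N={n:5d}  empirical MSE={simulate_mse(n):.6f}  "
          f"1/(6(N+2))={1.0 / (6.0 * (n + 2)):.6f}")
```

Under these assumed quantizer and prior, the simulated MSE matches 1/(6(N+2)) closely, illustrating the 1/N decay of a distributed one-bit scheme in the noiseless setting discussed in the abstract.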
Publication Information
Li, Xia; Guo, Jun; Chen, Hao; and Rogers, Uri. (2015). "The Performance Limit for Distributed Bayesian Estimation with Identical One-Bit Quantizers". 2015 IEEE Signal Processing and Signal Processing Education Workshop (SP/SPE), pp. 19-24.