Institute for Systems Research
Permanent URI for this community: http://hdl.handle.net/1903/4375
Search Results
3 results
Item: Distributed Hypothesis Testing with Data Compression (1991)
Shalaby, H.M.H.; Papamarcou, A.; ISR

We evaluate the performance of several multiterminal detection systems, each of which comprises a central detector and a network of remote sensors. The sensors collect data on a random signal source and process this information for transmission to the central detector. Transmission is via noiseless channels of limited capacity, hence data compression is necessary at each sensor. Upon receipt of the transmitted information, the central detector seeks to determine whether the true distribution governing the signal source belongs to a null class II or an alternative class X. System optimization is effected under the classical criterion that stipulates minimization of the type II error rate subject to an upper bound e on the type I error rate. We consider the asymptotic performance, measured by an appropriate error exponent, of five types of systems. The first type has a fixed number of sensors and processes spatially dependent but temporally independent data of growing sample size in time. Data compression for this type is at a rate that tends to zero, and the distribution classes II and X each consist of a single element. The second type of system is identical to the first, except that the classes II and X are composite. The third type is a variant of the first that employs fixed-rate data compression. The fourth type is altogether different, in that it employs a variable number of sensors handling independent data of fixed sample size, and intersensor communication is effected by two distinct feedback schemes. The fifth type of system is yet another variant of the first, in which data exhibit Markovian dependence in time and are compressed by fixed-bit quantizers.
In the majority of cases we obtain concise characterizations of the associated error exponents using information-theoretic tools.

Item: Error Exponents for Distributed Detection of Markov Sources (1991)
Shalaby, H.M.H.; Papamarcou, A.; ISR

We consider a decentralized detection problem in which two sensors collect data from a discrete-time, finite-valued, stationary ergodic Markov source and transmit M-ary messages to a Neyman-Pearson central detector. We assume that the codebook sizes M are fixed for both sensors and do not vary with the data sample size. We investigate the asymptotic behavior of the type II error rate as the sample size increases to infinity and obtain, under mild assumptions on the source distributions, the associated error exponent. The derived exponent is independent of the test level e and the codebook sizes M, is achieved by a universally optimal sequence of acceptance regions, and is characterized by an infimum of informational divergence over a class of infinite-dimensional distributions.

Item: Distributed Detection with Feedback (1990)
Shalaby, H.M.H.; Papamarcou, A.; ISR

We investigate the effects of feedback on a decentralized detection system consisting of N sensors and a data fusion center. It is assumed that observations are independent and identically distributed across sensors, and that each sensor uses a randomized scheme for compressing its observations into a fixed number of quantization levels. We consider two variations on this setup. One entails the transmission of sensor data to the fusion center in two stages, with feedback information broadcast from the center to the sensors after the first stage. The other involves information exchange between sensors prior to transmission to the fusion center; this exchange is effected through a feedback decision center, which processes binary data from the sensors and thereafter broadcasts a single feedback bit back to the sensors.
We show that under the Neyman-Pearson criterion, only the latter type of feedback yields an improvement in the asymptotic performance of the system (as N → ∞), and we derive the associated error exponents. We also demonstrate that deterministic compression schemes are asymptotically as powerful as randomized ones.
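All three abstracts work under the Neyman-Pearson criterion: minimize the type II error rate subject to a bound on the type I error rate, and measure asymptotic performance by the type II error exponent. The textbook baseline for this setup is Stein's lemma: for i.i.d. uncompressed observations, the best exponent is the divergence D(P0||P1), independent of the test level. The following sketch (not from the papers; the Bernoulli parameters and level are arbitrary choices for illustration) computes the exact optimal randomized Neyman-Pearson test for simple Bernoulli hypotheses and shows the empirical exponent -log(beta)/n approaching D(P0||P1) from below as n grows.

```python
import math

def kl_bernoulli(p, q):
    """D(Ber(p) || Ber(q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def np_type2_error(n, p0, p1, alpha):
    """Exact type II error of the optimal randomized Neyman-Pearson test of
    H0: Ber(p0) vs H1: Ber(p1) from n i.i.d. bits, at type I level alpha.
    Assumes p1 > p0, so the likelihood-ratio test rejects H0 when the
    number of ones k is large."""
    # Binomial pmfs of the sufficient statistic k under each hypothesis.
    pmf0 = [math.comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    pmf1 = [math.comb(n, k) * p1**k * (1 - p1)**(n - k) for k in range(n + 1)]
    tail0 = 0.0  # type I error accumulated by rejecting for all k > t
    for t in range(n, -1, -1):
        if tail0 + pmf0[t] > alpha:
            # Randomize at the boundary point t so type I error equals alpha.
            gamma = (alpha - tail0) / pmf0[t]
            return sum(pmf1[:t]) + (1 - gamma) * pmf1[t]
        tail0 += pmf0[t]
    return 0.0

if __name__ == "__main__":
    p0, p1, alpha = 0.3, 0.6, 0.05
    for n in (50, 200, 800):
        beta = np_type2_error(n, p0, p1, alpha)
        print(f"n={n:4d}  -log(beta)/n = {-math.log(beta) / n:.4f}")
    print(f"D(P0||P1) = {kl_bernoulli(p0, p1):.4f}")  # the Stein exponent
```

The papers above extend this baseline in several directions at once: compressed (rate-limited or fixed-bit) observations, composite classes, Markov dependence, and multiple sensors with feedback, which is why their exponents take more involved forms than a single divergence.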
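The Markov-source paper characterizes its exponent as an infimum of informational divergence over infinite-dimensional distributions; that object is beyond a short snippet, but its basic ingredient, the KL divergence rate between two finite-state stationary Markov chains, is easy to compute. The sketch below (an illustration, not the paper's construction; the transition matrices are arbitrary and assumed strictly positive and irreducible) evaluates the rate as the stationary average of the per-state divergences between transition rows.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of an irreducible row-stochastic matrix P,
    obtained as the left Perron eigenvector of P."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])  # eigenvalue closest to 1
    return v / v.sum()  # normalization also fixes the eigenvector's sign

def markov_kl_rate(P, Q):
    """KL divergence rate between stationary Markov chains with transition
    matrices P (null) and Q (alternative):
        sum_x pi_P(x) * D( P(.|x) || Q(.|x) ).
    Assumes P and Q are strictly positive so every log is finite."""
    pi = stationary(P)
    return float(np.sum(pi[:, None] * P * np.log(P / Q)))

if __name__ == "__main__":
    P = np.array([[0.9, 0.1], [0.2, 0.8]])  # sticky two-state chain
    Q = np.array([[0.5, 0.5], [0.5, 0.5]])  # i.i.d. fair coin as a chain
    print("pi_P =", stationary(P))
    print("rate =", markov_kl_rate(P, Q))
```

When every row of P equals a fixed vector p and every row of Q equals q, the rate collapses to the single-letter divergence D(p||q), recovering the i.i.d. case.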