Correlation and cooperation in network information theory

Date

2008-08-01

Abstract

Recent developments in wireless ad-hoc and sensor networking motivate the investigation of sophisticated phenomena that arise in such networks from an information-theoretic point of view. In this dissertation, we focus on two of these phenomena: correlation and cooperation. In wireless networks, correlation mainly originates from the correlated observations of different users, while cooperation is enabled by the wireless medium, which lets third-party users obtain part of the information from the transmitter and help deliver it to the destination.

We first study the effects of source correlation in multi-user networks. More specifically, we study distributed source and channel coding problems for correlated sources, e.g., the multiple access channel with correlated sources and the multi-terminal rate-distortion problem. In these problems, one often needs to characterize the joint probability distribution of a pair of random variables satisfying an $n$-letter Markov chain. An exact characterization of such probability distributions is intractable. We propose a new data processing inequality, which provides a single-letter necessary condition for the $n$-letter Markov chain. Our new data processing inequality yields outer bounds for the multiple access channel with correlated sources and for the multi-terminal rate-distortion region.
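For context (the dissertation's new single-letter condition itself is not reproduced here), the classical data processing inequality states that for any Markov chain $X \rightarrow Y \rightarrow Z$, $I(X;Z) \le I(X;Y)$. A minimal numerical sanity check in Python, with arbitrary illustrative distributions:

```python
# Numerical sanity check of the classical data processing inequality:
# for a Markov chain X -> Y -> Z, I(X;Z) <= I(X;Y).
# The distributions below are arbitrary illustrative choices, not from the dissertation.
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits, computed from a joint distribution matrix p_xy."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

# Markov chain X -> Y -> Z: p(x,y,z) = p(x) p(y|x) p(z|y)
p_x = np.array([0.4, 0.6])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.2, 0.8]])
p_z_given_y = np.array([[0.7, 0.3],
                        [0.1, 0.9]])

p_xy = p_x[:, None] * p_y_given_x   # joint distribution of (X, Y)
p_xz = p_xy @ p_z_given_y           # joint of (X, Z); valid since Z depends only on Y

assert mutual_information(p_xz) <= mutual_information(p_xy)  # DPI holds
```

Any choice of p(x), p(y|x), and p(z|y) defines a valid Markov chain, so the final assertion holds for every such input.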

Next, we investigate the role of correlation in cooperative multi-user networks. We consider the basic three-node relay channel, which is the simplest model for cooperative communications. We propose a new coding scheme for the relay channel, which takes the form of block Markov coding and is based on preserving the correlation in the channel inputs of the transmitter and the relay. The analysis of the error events yields three conditions containing mutual information expressions that involve infinitely many letters of the underlying random process. We lower bound these mutual information expressions to obtain three single-letter conditions. We show that the rates achievable with the classical compress-and-forward (CAF) scheme are a special case of the rates achievable with our new coding scheme. We therefore conclude that our proposed coding scheme yields potentially larger rates than the CAF scheme.
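For reference, the classical CAF achievable rate recovered as a special case above is, in the standard Cover-El Gamal formulation (stated here for context; the new scheme's three single-letter conditions are not reproduced):

```latex
% Compress-and-forward achievable rate for the relay channel
% (Cover-El Gamal), over distributions of the form
% p(x)\,p(x_1)\,p(y,y_1 \mid x,x_1)\,p(\hat{y}_1 \mid y_1,x_1):
R \le I(X;\, \hat{Y}_1, Y \mid X_1)
\quad \text{subject to} \quad
I(X_1; Y) \ge I(Y_1;\, \hat{Y}_1 \mid X_1, Y)
```

Here $X$ is the transmitter input, $X_1$ the relay input, $Y_1$ the relay observation, $\hat{Y}_1$ its compressed description, and $Y$ the destination output.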

Finally, we focus on the diamond channel, a four-node cooperative communication network. We study a special class of diamond channels consisting of a transmitter, a noisy relay, a noiseless relay, and a destination. We determine the capacity of this class of diamond channels by providing an achievable scheme and a matching converse. The capacity we determine is strictly smaller than the cut-set bound. Our result also shows the optimality of a combination of decode-and-forward (DAF) and CAF at the noisy relay node; this is the first example in which a combination of DAF and CAF is shown to be capacity achieving. We also uncover a duality between this diamond channel coding problem and the Kaspi-Berger source coding problem.
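For reference, the cut-set bound mentioned above is the standard outer bound for single-message networks: with node inputs $X_i$ and outputs $Y_i$, the capacity satisfies

```latex
% Cut-set outer bound (standard form), minimized over all cuts S
% that separate the source from the destination:
C \le \max_{p(x_1,\dots,x_m)} \; \min_{S} \; I\big(X_S ;\, Y_{S^c} \mid X_{S^c}\big)
```

where $S$ ranges over vertex sets containing the source but not the destination. The converse in the dissertation shows that, for the class of diamond channels considered, the capacity is strictly below this bound.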
