Browsing by Author "Snelling, W.E."
Item: Analysis of Compressive Receivers for the Optimal Interception of Frequency-Hopped Waveforms (1991)
Snelling, W.E.; Geraniotis, Evaggelos A.; ISR
This paper establishes that the compressive receiver is a practical, high-performance interceptor. Given a signal of a particular duration, a compressive receiver can estimate simultaneously all frequency components within a set wide band. This processing is similar to that of a parallel bank of narrowband filters, which is the optimal detector of frequency-hopped signals. Furthermore, hop frequency is estimated to yield performance equal to that of the parallel filter configuration. We assume the interference to be stationary, colored Gaussian noise and present a model of the compressive receiver that contains all its salient features. Locally optimal detection is achieved by taking the compressive receiver output as an observation and applying likelihood ratio theory at small signal-to-noise ratios. For small signals, this approach guarantees the largest probability of correct detection for a given probability of false alarm and thus provides a reference to which simplified or ad hoc schemes can be compared. Since the locally optimal detector has an unwieldy structure, a simplified suboptimal detector structure is developed that consists of a simple filter followed by a sampler and a square-envelope detector. Several candidates for the filter's response are presented. The performance of the locally optimal detector based on compressive-receiver observations is compared to that of the optimal filter-bank detector based on direct observations, thus showing the exact loss incurred when a compressive receiver is used.
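The simplified suboptimal structure just described, a filter followed by a sampler and a square-envelope detector, can be sketched generically. The tone frequency, SNR, and single narrowband correlator below are illustrative assumptions, not the filter candidates analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper)
n = 1024          # samples per observation window
f0 = 0.12         # normalized candidate hop frequency
amp = 0.5         # tone amplitude relative to unit-variance noise

t = np.arange(n)
noise = rng.standard_normal(n)
signal = amp * np.cos(2 * np.pi * f0 * t)
x = signal + noise

# "Filter": correlate against a complex exponential at the candidate
# frequency (one narrowband filter from a hypothetical bank), then take
# the squared envelope of the sampled output as the test statistic.
h = np.exp(-2j * np.pi * f0 * t) / np.sqrt(n)
stat = np.abs(np.dot(h, x)) ** 2          # signal present
stat_noise = np.abs(np.dot(h, noise)) ** 2  # noise only

print(stat, stat_noise)  # statistic is typically much larger with signal
```

Comparing `stat` against a threshold chosen for a target false-alarm rate gives the detect/no-detect decision.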
The performance of various simplified schemes based on compressive-receiver observations is analyzed and compared with that of the locally optimal detector.

Item: The Interception of Spread Spectrum Waveforms with the Amplitude Distribution Function (1992)
Snelling, W.E.; Geraniotis, Evaggelos A.; ISR
Within the research effort on the unfriendly detection and interception of secure communications, an innovative concept called the Amplitude Distribution Function (ADF) is used to construct a detector that enhances the radiometer. The ADF is introduced and shown to be, roughly, the average probability distribution of a random process. The significance of the ADF is that it is invariant under most spreading modulations, e.g., phase and frequency. This suggests that a detector built around the ADF would be robust and general-purpose. To develop the ADF methodology, a mathematical foundation is laid consisting of a sequence of definitions, lemmas, and theorems, an outline of which is included in the paper. The most significant result is that the ADF of signal plus noise is the convolution of the ADF of the signal and the ADF of the noise taken separately. These ideas are made applicable through the definition of the Amplitude Moment Statistic (AMS), a statistical transform that converges to the moment generating function of the ADF. Hence, the AMS is the vehicle for indirectly estimating the ADF from observations. For the particular problem of detecting a modulated sinusoid in stationary Gaussian noise, a detector is developed around the AMS. The detector's performance is analyzed, compared with that of a radiometer, and shown to be superior for small (10) time-bandwidth products.
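The convolution result quoted above parallels the familiar fact that the density of a sum of independent quantities is the convolution of the individual densities. The sketch below is a generic numerical illustration of that property for a random-phase sinusoid amplitude plus Gaussian noise; it does not implement the paper's AMS estimator, and all distributions and grid parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
m = 200_000

# Amplitude samples of a unit sinusoid with uniformly random phase
# (arcsine law) and of zero-mean Gaussian noise; illustrative choices.
s = np.cos(2 * np.pi * rng.random(m))
w = 0.5 * rng.standard_normal(m)

edges = np.linspace(-4, 4, 401)
dx = edges[1] - edges[0]

ps, _ = np.histogram(s, bins=edges, density=True)        # density of signal
pw, _ = np.histogram(w, bins=edges, density=True)        # density of noise
psum, _ = np.histogram(s + w, bins=edges, density=True)  # density of sum

# Numerical convolution of the two estimated densities
conv = np.convolve(ps, pw, mode="same") * dx

# The convolved density tracks the directly estimated density of the sum
err = np.max(np.abs(conv - psum))
print(err)  # small, up to histogram noise and half-bin grid offset
```

The same convolution structure is what lets the signal's amplitude distribution be deconvolved, at least in principle, from observations of signal plus noise.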
Item: Sequential Detection of Unknown Frequency-Hopped Waveforms (1988)
Snelling, W.E.; ISR
The channelized receiver that is optimal for the detection of unknown non-coherent frequency-hopped waveforms bases its decision on a fixed-length block of input data. In this paper we present a sequential method of interception in which, whenever a new data element is collected, a decision is made as to the presence or absence of a frequency-hopped waveform; if that decision is indeterminate, another data element is collected. An optimal sequential test is derived under the assumption that the waveform signal-to-noise ratio (S/N) is known. It is shown that this sequential test requires less data, on average, than the fixed-length method to make a decision with the same reliability. Also derived is a truncated sequential method in which a decision is forced, if still indeterminate, after some set amount of data has been collected. The truncated test is shown to reduce the number of samples needed for a decision when the input signal-to-noise ratio differs greatly from that assumed in the derivation of the test. Furthermore, it is shown that the truncated test yields a limited degree of robustness when the input S/N differs slightly from that assumed. A detailed analysis of the performance of these tests is conducted, from which a method for finding an optimal truncation point follows. Numerical results based on this analysis, as well as on simulation of the interceptor's performance, are presented to support the claims made above.
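The collect-decide-or-continue loop described above can be sketched as a generic Wald-style sequential probability ratio test with truncation. The Gaussian mean-shift model, thresholds, and truncation point below are illustrative assumptions standing in for the paper's channelized-receiver statistics:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameters, not the paper's values
mu = 0.5                             # assumed signal level under H1 (known S/N)
a, b = np.log(99), np.log(1 / 99)    # upper/lower log-LR thresholds (~1% errors)
max_n = 200                          # forced-decision (truncation) point

def sequential_test(samples):
    """Return (decision, samples_used); decision True means 'signal present'."""
    llr = 0.0
    for k, x in enumerate(samples, start=1):
        # Log-likelihood-ratio increment for N(mu, 1) vs N(0, 1)
        llr += mu * x - mu ** 2 / 2
        if llr >= a:
            return True, k    # declare signal
        if llr <= b:
            return False, k   # declare noise only
        if k == max_n:
            # Truncated test: force a decision at the cutoff
            return llr > 0, k
    return llr > 0, len(samples)

# Under H1 the test typically stops well before the truncation point
decision, used = sequential_test(mu + rng.standard_normal(max_n))
print(decision, used)
```

The average of `used` over many trials is the quantity the sequential test reduces relative to a fixed-length test of the same reliability; moving `max_n` trades that saving against robustness to an S/N mismatch.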