Institute for Systems Research Technical Reports
Permanent URI for this collection: http://hdl.handle.net/1903/4376
This archive contains a collection of reports generated by the faculty and students of the Institute for Systems Research (ISR), a permanent, interdisciplinary research unit in the A. James Clark School of Engineering at the University of Maryland. ISR-based projects are conducted through partnerships with industry and government, bringing together faculty and students from multiple academic departments and colleges across the university.
Item
Target Discrimination with Neural Networks (1995) Lin, Daw-Tung; Dayhoff, Judith E.; Resch, C.L.; ISR
The feasibility of distinguishing among the multiple component types of exo-atmospheric targets is demonstrated by applying the Time-Delay Neural Network (TDNN) and the Adaptive Time-Delay Neural Network (ATNN). Exo-atmospheric targets are especially difficult to distinguish with currently available techniques because all target parts follow the same spatial trajectory, so classification must be based on light sensors that record signals over time. Results demonstrate that the trained neural networks successfully identified warheads among other missile parts in a variety of simulated scenarios, including differing angles and tumbling. The network with adaptive time delays (the ATNN) performs a highly complex mapping from a limited set of training data and generalizes better to overall trends than the TDNN, which includes time delays but adapts only its weights. The ATNN was trained on data with additive noise and is shown to be robust to environmental variations.

Item
Sampling Effects on Trajectory Learning and Production (1995) Lin, Daw-Tung; Dayhoff, Judith E.; ISR
The time-delay neural network (TDNN) and the adaptive time-delay neural network (ATNN) are effective tools for signal production and trajectory generation. Previous studies have shown production of circular and figure-eight trajectories to be robust after training. We show here the effects of different sampling rates on the production of trajectories by the ATNN, including the influence of sampling rate on the robustness and noise resilience of the resulting system. Although training was fast and the trajectory was learned successfully with few samples per trajectory, more resilience to noise was observed with higher numbers of samples per trajectory.
The effects of changing the initial segment that begins trajectory generation were evaluated: a minimum initial-segment length is required, but the location of that segment does not influence the trajectory generation, even when different initial segments are used during training and recall. A major conclusion from these results is that the network learns the inherent features of the trajectory rather than memorizing each point. When a recurrent loop was added from the output to the input of the ATNN, training was shown to produce a network attractor for a figure-eight trajectory, which involves more complexity, due to its crossover, than the previously trained circular-trajectory attractor. Furthermore, when the trajectory length was not a multiple of the sampling interval, the trained network generated intervening points on subsequent repetitions of the trajectory, a feature of limit-cycle attractors observed in dynamic networks. Thus an effective method of training an individual dynamic attractor into a neural network is extended to more complex trajectories and shown to exhibit the properties of a limit-cycle attractor.

Item
Network Unfolding Algorithm and Universal Spatiotemporal Function Approximation (1995) Lin, Daw-Tung; Dayhoff, Judith E.; ISR
It has previously been known that a feed-forward network with time delays can be unfolded into a conventional feed-forward network with a time history as input. In this paper, we show explicitly how this unfolding can be carried out, with a newly defined Network Unfolding Algorithm (NUA) that creates virtual units and moves all time delays into a preprocessing stage consisting of the time histories. The NUA provides a tool for analyzing the complexity of the ATNN; using it, we conclude that the ATNN reduces network complexity by at least a factor of O(n) compared to an unfolded backpropagation net.
We then apply the theorems of Funahashi and of Hornik et al., together with the Stone-Weierstrass theorem, to establish the general function-approximation ability of the ATNN. We furthermore show (Lemma 1) that the adaptation of time delays is mathematically equivalent to the adjustment of interconnections in an unfolded feed-forward network, provided there is a large enough number (h2^{nd}) of hidden units. Since this number of hidden units is often impractically large, we conclude that the TDNN and ATNN are more powerful than a backpropagation network with a time-history input.
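The unfolding idea behind the NUA can be illustrated with a toy computation: a single time-delay connection layer, y(t) = Σ_k w_k · x(t − d_k), is equivalent to a static weight vector applied to the input's time history, where the delays become fixed positions in that vector. A minimal sketch, assuming integer delays and a scalar input signal (the delays, weights, and signal below are invented for illustration; the report's algorithm handles full multi-layer networks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: one output unit fed by a scalar input signal
# through three delayed connections.
delays = [0, 2, 5]              # integer time delays d_k
weights = rng.normal(size=3)    # connection weights w_k
x = rng.normal(size=50)         # input signal x(t)
window = max(delays) + 1        # time-history length needed at the input

def tdnn_output(t):
    """Time-delay form: y(t) = sum_k w_k * x(t - d_k)."""
    return sum(w * x[t - d] for w, d in zip(weights, delays))

# "Unfolding": move every delay into a preprocessing stage that presents
# the time history [x(t), x(t-1), ..., x(t-window+1)] to a static weight
# vector that is zero except at the delay offsets (the virtual-unit
# connections of the unfolded net).
unfolded_w = np.zeros(window)
for w, d in zip(weights, delays):
    unfolded_w[d] += w

def unfolded_output(t):
    # history[d] = x(t - d)
    history = x[t - window + 1 : t + 1][::-1]
    return unfolded_w @ history

# The two formulations agree at every time step with a full history.
for t in range(window - 1, len(x)):
    assert np.isclose(tdnn_output(t), unfolded_output(t))
print("time-delay and unfolded outputs agree")
```

Note how the sketch is consistent with the complexity claim above: adapting a delay d_k in the ATNN merely relocates a nonzero entry in the unfolded weight vector, whereas a static unfolded network must carry a weight for every position in the history window.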