Title: Human Emotion Recognition from Motion Using a Radial Basis Function Network Architecture
Authors: Rosenblum, Mark; Yacoob, Yaser; Davis, Larry S.
Type: Technical Report (also cross-referenced as CAR-TR-721)
Language: en-US

Abstract: In this paper a radial basis function network architecture is developed that learns the correlation between facial feature motion patterns and human emotions. We describe a hierarchical approach which, at the highest level, identifies emotions; at the mid level, determines motions of facial features; and at the low level, recovers motion directions. Individual emotion networks were trained to recognize the "smile" and "surprise" emotions. Each network was trained by viewing a set of sequences of one emotion for many subjects. The trained neural network was then tested for retention, extrapolation, and rejection ability. Success rates were about 88% for retention, 73% for extrapolation, and 79% for rejection.
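To make the classification stage concrete, the sketch below shows a generic radial basis function network that maps motion-feature vectors to emotion scores. It is not the authors' implementation: the center-selection strategy, Gaussian widths, feature dimensions, and the synthetic "smile"/"surprise" labels are all illustrative assumptions.

import numpy as np

class RBFNetwork:
    """Generic RBF classifier sketch (illustrative, not the report's method)."""

    def __init__(self, n_centers=10, sigma=1.0, seed=0):
        self.n_centers = n_centers
        self.sigma = sigma
        self.rng = np.random.default_rng(seed)

    def _phi(self, X):
        # Gaussian hidden-unit activations: one column per center.
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def fit(self, X, y):
        # Pick centers as a random subset of training vectors (an assumption),
        # then solve the output weights by least squares on the activations.
        idx = self.rng.choice(len(X), size=min(self.n_centers, len(X)), replace=False)
        self.centers = X[idx]
        targets = np.eye(y.max() + 1)[y]          # one-hot emotion labels
        H = self._phi(X)
        self.weights, *_ = np.linalg.lstsq(H, targets, rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.weights        # per-class emotion scores

# Toy usage with hypothetical mid-level motion descriptors (e.g. mouth-corner
# and eyebrow motion directions); the data and labels below are synthetic.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 6))                      # 40 sequences, 6 motion features
y = (X[:, 0] + X[:, 1] > 0).astype(int)           # 0 = "smile", 1 = "surprise"
net = RBFNetwork(n_centers=8, sigma=1.5).fit(X, y)
print(net.predict(X[:3]).argmax(axis=1))          # predicted emotion indices

In the report's hierarchical setting, each emotion would get its own such network trained on feature-motion sequences from many subjects, with the low- and mid-level stages supplying the motion-direction inputs.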