A Language for Human Action

Date: 2006-10-15
Authors: Guerra-Filho, Gutemberg; Aloimonos, Yiannis
Abstract
Human-centered computing (HCC) is centered on humans and what they do,
i.e., human actions. Thus, developing an infrastructure for HCC requires
understanding human action at some level of detail. We need to be able to
talk about actions, synthesize actions, recognize actions, manipulate
actions, imitate actions, and imagine and predict actions. How could we
achieve this in a principled fashion? This paper proposes that the space
of human actions has a linguistic structure. This is a sensory-motor space
consisting of the evolution of the joint angles of the human body in
movement. The space of human activity has its own phonemes, morphemes, and
sentences. We present a Human Activity Language (HAL) for symbolic
non-arbitrary representation of visual and motor information. In
phonology, we define atomic segments (kinetemes) that are used to compose
human activity. In morphology, we propose parallel learning to incorporate
associative learning into a language inference approach. Parallel learning
solves the problem of overgeneralization and is effective in identifying
the active joints and motion patterns in a particular action. In syntax,
we point out some of the basic constraints for sentence formation.
Finally, we demonstrate this linguistic framework on a praxicon of 200
human actions (motion-capture data obtained with a body suit) and we
discuss the implications of HAL for HCC.
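To make the phonology step concrete, the sketch below segments a single joint-angle trajectory into atomic units of monotonic motion. Note that splitting at sign changes of the velocity is an illustrative assumption for this sketch, not the paper's exact definition of kinetemes, and the function name `segment_trajectory` is hypothetical.

```python
# Hypothetical sketch: splitting a 1-D joint-angle series into atomic
# segments ("kineteme"-style units). The criterion used here (a sign
# change in the per-sample velocity) is an assumption for illustration,
# not the authors' exact segmentation rule.

def segment_trajectory(angles):
    """Split a joint-angle series into runs of consistent motion.

    Returns a list of (start, end) index pairs; within each segment the
    direction of motion (increasing, decreasing, or at rest) is constant.
    """
    if len(angles) < 2:
        return [(0, len(angles) - 1)] if angles else []

    def sign(x):
        return (x > 0) - (x < 0)

    segments = []
    start = 0
    prev = sign(angles[1] - angles[0])
    for i in range(1, len(angles) - 1):
        cur = sign(angles[i + 1] - angles[i])
        if cur != prev:          # velocity changed sign or reached zero
            segments.append((start, i))
            start = i
            prev = cur
    segments.append((start, len(angles) - 1))
    return segments

# Example: an elbow angle that flexes, holds still, then extends,
# yielding three atomic segments.
traj = [0.0, 0.2, 0.5, 0.9, 0.9, 0.9, 0.6, 0.3, 0.1]
print(segment_trajectory(traj))  # → [(0, 3), (3, 5), (5, 8)]
```

In the paper's framework, such atomic segments would then serve as the alphabet from which morphemes (recurring motion patterns per joint) and sentences (whole actions) are composed.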