Perception of human interaction based on motion trajectories: from aerial videos to decontextualized animations

Image credit: Topics in Cognitive Science

Abstract

People are adept at perceiving interactions from movements of simple shapes, but the underlying mechanism remains unknown. Previous studies have often used object movements defined by experimenters. The present study used aerial videos recorded by drones in a real-life environment to generate decontextualized motion stimuli, with the motion trajectories of the displayed elements as the only visual input. We measured human judgments of interactiveness between two moving elements and the dynamic change in those judgments over time. A hierarchical model was developed to account for human performance in this task. The model represents interactivity with latent variables and learns the distribution of critical movement features that signal potential interactivity. It provides a good fit to human judgments and generalizes to the original Heider and Simmel (1944) animations. The model can also synthesize decontextualized animations with a controlled degree of interactiveness, providing a viable tool for studying animacy and social perception.
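The core idea in the abstract, a latent interactivity variable inferred over time from distributions of movement features, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the feature set (inter-agent distance, relative speed, heading alignment), the Gaussian parameters for the two latent states, and the simple sequential log-odds update are all assumptions chosen purely for illustration.

```python
# Minimal illustrative sketch (not the paper's model): a latent
# "interactive vs. independent" variable is updated frame by frame
# from simple movement features of two trajectories.
import numpy as np

def movement_features(a, b):
    """Per-frame features from two (T, 2) trajectories: inter-agent
    distance, relative speed, and heading alignment (assumed features)."""
    dist = np.linalg.norm(a - b, axis=1)
    va, vb = np.diff(a, axis=0), np.diff(b, axis=0)
    rel_speed = np.linalg.norm(va - vb, axis=1)
    align = np.einsum('ij,ij->i', va, vb) / (
        np.linalg.norm(va, axis=1) * np.linalg.norm(vb, axis=1) + 1e-8)
    return np.column_stack([dist[1:], rel_speed, align])

def log_gauss(x, mu, sigma):
    """Log density of a diagonal Gaussian."""
    return -0.5 * np.sum(((x - mu) / sigma) ** 2
                         + np.log(2 * np.pi * sigma ** 2), axis=-1)

def interactivity_posterior(feats, prior=0.5):
    """Sequential Bayesian update of P(interactive | features so far),
    using hand-picked feature distributions for the two latent states."""
    mu_int, sig_int = np.array([20.0, 1.0, 0.5]), np.array([15.0, 1.0, 0.4])
    mu_ind, sig_ind = np.array([60.0, 3.0, 0.0]), np.array([40.0, 3.0, 0.6])
    log_odds = np.log(prior / (1 - prior))
    posteriors = []
    for f in feats:
        log_odds += log_gauss(f, mu_int, sig_int) - log_gauss(f, mu_ind, sig_ind)
        posteriors.append(1.0 / (1.0 + np.exp(-log_odds)))
    return np.array(posteriors)

# Toy usage: one element loosely follows the other, suggesting interaction.
rng = np.random.default_rng(0)
leader = np.cumsum(rng.normal(0, 2, size=(50, 2)), axis=0)
follower = leader + rng.normal(0, 5, size=(50, 2))
feats = movement_features(leader, follower)
print(interactivity_posterior(feats)[-1])  # judged probability of interaction
```

In the paper's model the feature distributions are learned from the aerial-video data rather than hand-specified, and the latent structure is hierarchical; the sketch only conveys the flavor of accumulating evidence for interactivity over time.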

Publication
Topics in Cognitive Science, 10(1), 225-241


Yujia Peng
Assistant Professor of Psychology

Yujia Peng is an assistant professor at the School of Psychological and Cognitive Sciences, Peking University.