Publication | ACM SIGCHI Conference on Human Factors in Computing Systems 2022
AvatAR
An Immersive Analysis Environment for Human Motion Data Combining Interactive 3D Avatars and Trajectories
This paper presents AvatAR, an augmented reality (AR) environment for analyzing recordings of human motion data to gain insights into how humans utilize and interact with their surrounding environment. Using avatars, trajectories, and environmental visualizations (footprints, touchpoints, etc.), we make data not only explorable, but experienceable.
Abstract
Patrick Reipschläger, Frederik Brudy, Raimund Dachselt, Justin Matejka, George Fitzmaurice, Fraser Anderson
Analysis of human motion data can reveal valuable insights about the utilization of space and the interaction of humans with their environment. To support this, we present AvatAR, an immersive analysis environment for the in-situ visualization of human motion data that combines 3D trajectories with virtual avatars showing people’s detailed movement and posture. Additionally, we describe how visualizations can be embedded directly into the environment, showing what a person looked at or what surfaces they touched, and how the avatar’s body parts can be used to access and manipulate those visualizations. AvatAR combines an AR HMD with a tablet to provide both mid-air and touch interaction for system control, as well as an additional overview device to help users navigate the environment. We implemented a prototype and present several scenarios to show that AvatAR can enhance the analysis of human motion data by making data not only explorable, but experienceable.
AvatAR is an augmented reality environment for analyzing recordings of human motion data, serving to gain insights into how humans utilize and interact with their surrounding environment. We combine head-mounted augmented reality with a tablet device to display and manipulate in-situ visualizations of time-varying motion data directly in the environment in which it was recorded.
Associated Researchers
Patrick Reipschläger
Technische Universität Dresden
Raimund Dachselt
Technische Universität Dresden