Movement Visualizer & Nonverbal Synchrony

News: Our poster has been accepted to IEEE VR!

Shaikh, O., Sun, Y., & Won, A. S. (accepted). Movement visualizer for networked virtual reality platforms. IEEE VR.

Motivation for Movement Tracking in VR

  • Movement tracking measures nonverbal behavior more accurately than manually coding movement from video.
  • Movement data reflects people's unconscious or subconscious responses in virtual reality, and can be more objective than self-report surveys.
  • Movement data reveals an individual's emotion and state, and, combined with ergonomics measures, can inform researchers' and designers' UX decisions.

Movement Visualizer Interface Design

User Interaction Sketches:

[Four scanned sketches of the user interaction design, 2018-02-16]

      The interface of the Movement Visualizer is shown in Figure 1 below. It displays a live summary of two participants' movement. Figure 2 shows the tracker projected into the virtual world on the High Fidelity platform; the participants' movements are recorded and displayed live on the tracker.

        Figure 1: A screenshot of the movement tracker

         Figure 2: A screenshot of two generic avatars on the High Fidelity platform with the movement tracker in the virtual world

Nonverbal Synchrony in a Networked Virtual Environment

      One application of the movement tracker is studying nonverbal synchrony in virtual reality. Because participants wear a head-mounted display (HMD) and hold two controllers, their movements are detected and recorded by these devices. With access to the head and hand data, we can quantify movement and interpret the meaning behind it.
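To make this concrete, here is a minimal sketch of how head-and-hand data could be summarized and compared across two participants. It assumes each frame is a dict mapping tracker names ("head", "left_hand", "right_hand") to (x, y, z) positions; this format and the Pearson-correlation synchrony measure are illustrative assumptions, not the pipeline actually used in the study.

```python
import math

def movement_speeds(frames):
    """Per-frame movement summary: total displacement of head and both hands.

    `frames` is a list of dicts mapping a tracker name ("head", "left_hand",
    "right_hand") to an (x, y, z) position tuple. This is an assumed logging
    format, not the actual High Fidelity data schema.
    """
    speeds = []
    for prev, cur in zip(frames, frames[1:]):
        total = 0.0
        for tracker in ("head", "left_hand", "right_hand"):
            total += math.dist(prev[tracker], cur[tracker])  # Euclidean distance
        speeds.append(total)
    return speeds

def pearson(a, b):
    """Pearson correlation of two equal-length movement-speed series.

    A simple stand-in for a synchrony measure: values near 1 mean the two
    participants' movement intensity rises and falls together.
    """
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)
```

In practice, synchrony analyses often use windowed or lagged correlations rather than a single global coefficient, but the idea is the same: reduce each participant's tracked positions to a movement time series, then compare the two series.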

      We can also manipulate the avatar's appearance to see how it affects people's responses and their performance in virtual reality.

Since this is an ongoing project, we are currently analyzing data and drafting a second paper. Feel free to contact me for more details.