News (9/16/2019): “Nonverbal Synchrony in Virtual Reality” Published in PLOS ONE!

Sun, Y., Shaikh, O., & Won, A. S. (2019). Nonverbal synchrony in virtual reality. PLOS ONE, 14(9), e0221803.

News (8/15/2019): Presenting at HFES 2019!

Sun, Y., Kar, G., Won, A., & Hedge, A. (in press). Postural risks and user experience of 3D interface designs for virtual reality-based learning environments.

News (10/18/2018): Paper accepted to HFES 2018!

Kar, G., Sun, Y., Celikors, E., et al. (2018). Effects of a dynamic foot movement device on cognitive performance in short-duration computer-based tasks. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1), 460-464.

News (3/17/2018): Paper accepted to the International Society for Presence Research (ISPR) conference!

Sun, Y., Pandita, S., Shaikh, O., Kim, B. & Won, A. S. (2018). Personalized avatars and self-presence. In Proceedings of the International Society for Presence Research Annual Conference, May 21-22, Prague, Czech Republic.

News (2/15/2018): Poster accepted to IEEE VR!

Shaikh, O., Sun, Y., & Won, A. S. (2018). Movement visualizer for networked virtual reality platforms. IEEE VR.

UX Design Lead: Learning Moon Phases in Virtual Reality

My Role (Aug. 2017 - March 2018):

  • Designed and prototyped three interactive headset mockups of the 3D quiz question display in Unity

  • Developed the key performance and engagement measures (movement & interaction tracking, posture analysis & ergonomic measures)

  • Conducted a usability study with 41 participants

**Paper in Press**

Graduate Researcher: Team Interactions in VR


My Role (Aug. 2016 - March 2018): 

  • Stage 1: Redesigned the user interface and interaction of the movement visualizer

  • Stage 2: Studied the influence of different levels of avatar customization on team performance using the visualizer, with 96 pairs of participants


3D Generalist: 3D Modeling & Avatar Customization


My Role (June 2017 - Aug. 2017): 

  • Built 3D models from real scenes using 3ds Max

  • Created avatars with different levels of customization using FaceGen, Daz Studio, Fuse CC and Mixamo


UX Designer: BUMP -- Interactive Voice-based Technology

Team: Four Information Science students

My Role (Jan. 2017 - May 2017):

  • Designed the user experience of the interactive voice interface and its companion app

  • Brainstormed ideas, created personas and storyboards, and built low- to high-fidelity prototypes and wireframes using Sketch, Figma, and InVision

  • Conducted usability tests, including A/B testing, surveys, interviews, and Wizard of Oz studies


UX Designer: Slim by Design -- iOS App Design

Team: Three Information Science students

My Role (Aug. 2016 - Dec. 2016):

  • Designed the user flow and interface of the iOS app using Sketch, Figma, and InVision

  • Iterated on the colors and icons of the visual design, and refined the search and rating features of the app prototypes


Ergonomic Analysis on a Dynamic Foot Movement Device


News: Our paper is accepted to the Human Factors and Ergonomics Society (HFES) 2018 Annual Meeting!

Effects of a dynamic foot movement device on cognitive performance in short-duration computer-based tasks.

My Role (Aug. 2017- Dec. 2017): 

  • Designed the ergonomic tasks to measure reaction time, short-term memory, precision and productivity

  • Analyzed data from four cognitive tasks and self-report surveys, and wrote the report

**Conference submission in progress** Feel free to contact me for details.

Data Visualization Class Projects

My Role (Jan. 2017 - May 2017):

  • Used the D3 library, JavaScript, HTML, and CSS to create interactive and static data visualization projects in a team of three students.