BHCI Capstone

DoryVR
Storytelling with Data in VR

Virtual Reality
Meta Quest 3
Role
UX Designer
Timeline
4 months
(Jan - May '24)
Tools
Figma, FigJam, Adobe CC, Blender, Meta Quest 3
Team Members
Tianshu Huang, Menghan Liu, Sunniva Liu, Anishwar Tirupathur 

01. OVERVIEW

UX Designer

I created the user interface for the VR program’s client-side software. I applied my Architecture background to design the user interactions and layout of the 3D Virtual Reality presentation room. I led the production of multiple working prototypes and engaged in user research, using direct feedback to guide new iterations.

Role

Dr. Kim Hyatt
Professor
at Carnegie Mellon University
Heinz College of Information Systems and Public Policy

Dr. Kim Hyatt is the professor for the Heinz College course “Communication in VR”, which teaches Master’s students how to give presentations about data in virtual reality (VR).

Client

The objective of the college course “Communication in VR” is to teach students how to present data in Virtual Reality.

Dr. Hyatt uses Spatial, a VR platform, to host her classes. Students attend class using VR headsets (Meta Quest 3). Currently, Spatial does not allow students to import or present custom 3D media or datasets.

Students primarily rely on slideshows and 2-dimensional screenshots of data as supplemental materials. However, representing 3-dimensional data in 2 dimensions often produces visual clutter and information overload. Students also found 2D data less interesting to look at, and were therefore less engaged.

Context

PROBLEM

Spatial does not offer a way to view data or media in 3D.

Class presentations feel too traditional and “2D”. Students are unable to acquire the skills needed to present 3D data in Virtual Reality.

GOAL

To create a custom VR environment for peer-to-peer presentation of a diverse range of media types, including custom 3D data.

To familiarize students with VR technology, and enhance the data storytelling experience.  

02. RESEARCH

Questions

Our team came up with research questions that would help hone our project focus.

RESEARCH QUESTION #1

How do we make the process of learning to tell stories with data in VR more engaging for students?

RESEARCH QUESTION #2

What is the process of learning to communicate in VR like? What common behaviors occur?

To answer the research questions, our team compiled the following:

Literature Review on education and data visualization in VR

Competitive Analysis of four VR data visualization tools to gauge the current climate of data storytelling in VR

2 classroom analyses where we shadowed both students and instructors to gain a holistic understanding of the education process

5 follow-up contextual inquiries with students to learn about their behavioral patterns, roadblocks, and overarching goals for VR presentation

5 interviews with VR experts where we asked questions about the state of XR (Extended Reality)

Methods

Classroom Observation

Students are presenting a slideshow in a VR space, while the professor watches and critiques.

Mapping

Based on our research, our team created an affinity diagram and empathy map.

The affinity diagram helped us discover overarching themes and pain points amongst our users, which include:

pseudo-physical challenges unique to VR

educational challenges unique to VR

new behavioral norms in VR

limitations that prevent robust use of VR

advantages of VR over traditional methods

The empathy map helped us find unique behaviors and pain points that occur in a VR environment.

Affinity Diagram
Empathy Map

Insights

INSIGHT #1

XR creates a new environmental dimension of interaction

"For example, walking through the galaxy, or exploring the intricacies of the human heart... and having a multi-sensory experience."
INSIGHT #2

XR limits accessibility and nonverbal communication

Students face behavioral barriers transitioning from traditional classrooms to XR. Avatars in VR cannot use eye contact, facial expressions, or body language, which restricts communication.

INSIGHT #3

Presentation in XR adds flexibility and alleviates social anxiety

Presenting through a digital medium reduces speakers' nerves. Students can join class remotely, and access private supplemental material to help them present.

INSIGHT #4

Due to its novelty, VR has technical limitations that prevent robust use

Students experience glitches and technological limits in VR, as certain features that bridge the real and virtual worlds have not yet been integrated.

INSIGHT #5

AR (Augmented Reality) can be used to create more nuanced communication spaces than VR alone

The combination of AR and VR features can create a more robust learning environment.

Brainstorming

We used sticky notes on a virtual wall to discuss and compare ideas, identifying the viability of proposed solutions.

We hosted a design charrette session to narrow our focus on which key features to develop.

Key Features

Level of immersion
Change the environment from VR to AR; users can hybridize their data storytelling experience
Body Tracking Avatars
Virtual avatars that track and emulate your body language and facial expressions
Item Visibility
Control visibility of a presenter’s object or other digital item
AI Image Placement
Use AI to automatically orient presentation objects towards audience members and presenter
Pointer Tool
Virtual cursor that helps direct attention for a presentation or collaborative assignment
Individual Dashboard
Virtual tablet that helps student present, receive feedback, and direct audience attention
Onboarding Video
Tutorial video to help users overcome VR learning curve
Data Presentation Table
Object in the XR scene that presenters can use to present 3D data or models

User Flow


03. SYNTHESIS

User Tests & Prototypes

We conducted multiple think-aloud tests with a total of 12 students in the “Communication in VR” class throughout our product development.

Feature 1: Individual Dashboard + Pointer

STORYBOARD #1
Individual Dashboard + Pointer

Presenters have a private dashboard for lecture notes, next slides, and a timer, ensuring no distractions for the audience. Audience members can also view a spectator's version of the private dashboard in case their view of the presenter is obstructed. Speakers can use a laser pointer to help guide audience attention.

LOFI PROTOTYPE + USER TEST

QUESTION: Should the private dashboard be tablet-sized and appear in front of the speaker, or SmartBoard-sized and appear in the back of the room?

FEEDBACK: Presenter dashboards at the back of the room better direct audience focus, while individual dashboards help the audience track content more easily
MIDFI PROTOTYPE + USER TEST

QUESTION: How will users interact with the features we offer in the design of the individual dashboard for presenters?

FEEDBACK: While users preferred the MidFi iteration, the unlabeled icon buttons were confusing, which undermined their intended use.

Feature 2: Levels of Immersion

STORYBOARD #2
Adjustable Immersion Levels

Users can control whether they see a mix of their real surroundings and virtual elements (AR) or are completely immersed in the VR world. This mixed reality environment uses both AR and VR together.

LOFI PROTOTYPE + USER TEST

QUESTION: Is this something that users believe is useful and applicable to their classroom setting?

FEEDBACK: Adjusting immersion can be useful for reducing distractions in AR/VR environments (e.g., turn up immersion if a roommate is home; turn down immersion if the VR background is distracting).
MIDFI PROTOTYPE + USER TEST

QUESTION: Does effectiveness of the tool translate through the simple, slider-based UI? Is this something students see themselves using during presentations?

FEEDBACK: Users desired better clarity regarding the best ways to use the "Levels of Immersion" feature during active presentation.

Feature 3: Interactive Scalable Data

STORYBOARD #3
Interactive Scalable Data

Speakers have a designated table for presenting and interacting with 3D data, which they can scale up to fill the whole room. Supplemental 2D slides are displayed behind the speaker for simultaneous presentation of traditional and 3D data.

LOFI PROTOTYPE + USER TEST

QUESTION: Should the 3D data appear on a presenter's table at the front of the room, emulating a traditional classroom, or should it be scaled up to fill the room so the audience can "walk" through it?

FEEDBACK: Users enjoyed walking inside the scaled-up 3D data in the storyboard, but disliked it in the LoFi prototype: the room was too small, the bars too cluttered, and audience members blocked each other's views.
MIDFI PROTOTYPE + USER TEST

QUESTION: Do the updates to the design of the presentation room create a more intuitive user experience?

FEEDBACK: Users enjoy the architectural updates to the room. They felt the idea could be fleshed out further with more detail, such as labels on the data. In the first version, the presentation table is too far away; the scaled-up version is more playful and interactive.

Feature 4: Presentation Recording for Feedback + Self-Reflection

STORYBOARD #4
Presentation Recording for Feedback + Self-Reflection

Speakers can record themselves presenting from both their perspective and the audience's perspective. The autogenerated transcript produced from the recording can be commented on for feedback.

LOFI PROTOTYPE + USER TEST

QUESTION: Should the feedback be specific to certain points in the transcript, or should it summarize the presentation as a whole?

FEEDBACK: Time-specific critique provides more useful information. Users want feedback on gestures, wording, speed, audience engagement, and content.
MIDFI PROTOTYPE + USER TEST

QUESTION: Does the timestamped feedback in the recording dashboard help users self-improve?

FEEDBACK: Time-specific critique provides more useful information. Users want feedback on gestures, wording, speed, audience engagement, and content.

04. SOLUTION

DoryVR is a collaborative Extended Reality (AR + VR) program that allows users to present both 2D and 3D data

Demo Video ⤵️

Final Report ⤵️

CORE FEATURE #1

Individual Dashboard + Laser Pointer

Presenters and audience members can access a private tablet that provides supplemental content.

Presenter's Dashboard

Presenters can access slide previews, speaker notes, a timer, and a recording feature.

The speaker can pull data and other media directly from this dashboard into the presentation space. They can also change the way the data is displayed by toggling between the data view modes.

Audience's Dashboard

Audience members can access a view-only version of the presenter's dashboard.

Using the dashboard, the audience can read slides and see other presentation materials at their own pace, as well as take notes.

Presenter's Laser Pointer

Speakers can toggle a laser pointer to interact with presentation features more smoothly and to better direct the audience's attention.

CORE FEATURE #3

Interactive, Scalable 3D Data

Users can display, scale, and interact with data in both 2D and 3D, allowing for the effective communication and presentation of multidimensional media.

Interactive 3D Data

Users can load 3D data into the presentation space. They can interact with these data displays in ways that increase clarity while presenting.

For example, users can highlight bar graphs and scatterplot points.

Scalable Data

Users can scale the data up so that it takes up the entire presentation space. Audience members can "walk through" this data in VR.

Therefore, users can take advantage of a new dimension of space and scale as a means of communication.

CORE FEATURE #2

Adjustable Levels of Immersion

Users can adjust the level of immersion between AR and VR for a less distracting user experience.

The level of immersion can be adjusted with a slider so that users can choose between Augmented and Virtual Reality. AR allows users to view data presented against their real life background, while VR is fully immersive.

Adjusting the Level of Immersion

Augmented Reality

05. FUTURE STEPS

Due to time limitations, our group was not able to develop a fully functioning program with comprehensive VR integration. With more time, this would be our next step.

Building an independent virtual reality program brings its own UX challenges, including…

Building a custom VR classroom environment

Our research showed us that surreal backgrounds may increase classroom happiness and memory retention. I would love to apply my architecture background, and use the limitless possibilities available in VR to craft custom classroom environments to facilitate better learning.

Integrating AI to help improve accurate data communication

As AI robustness quickly improves, I would love to explore the way generative content can help give feedback on students’ presentations.

Our avatar standing in our custom designed environment, created in Minecraft and used during our Mid-Fi user tests

06. REFLECTION

The possibilities of Virtual Reality are endless – anything you can imagine, you can create. Tackling a project with an entirely new, completely customizable spatial dimension presented unique challenges, but it also expanded my definition of the scope of UX design.

It was especially rewarding to apply both my UX and Architecture backgrounds to the design process. Understanding the relationship between users and their greater spatial environment helped me create a user flow and UI in 3-dimensional space, and revealed the ways that environment influences their psychology. For example, the VR classrooms the students were using were vast in scale, with high ceilings and open-concept spaces. Students described this novel environment as more attention-capturing than traditional classrooms, and thus felt more engaged in learning.

One of the most fascinating facets of this project was navigating what I call the “2.5D” nature of VR. While users can see and perceive spaces in 3 dimensions, they cannot physically interact with them. Designing for this unique intersection required trust in the iterative process and guidance from user feedback, which often pushed me to rethink my assumptions and uncover more intuitive solutions.

This project has expanded my perspective on UX design, and reminded me why I’m so passionate about this field – it’s a constantly evolving space where technology pushes the scope of human experiences. It was a privilege to be able to work with novel, innovative tech such as the VR headsets, courtesy of the CMU HCII lab. I’m excited to continue exploring how UX and novel tech can redefine the way people approach challenges.