EMPATHYbuild

VR experience: testing the effectiveness of the metaverse and virtual reality in elevating user-centered design

Company

SAP

Tags

Experience, Research, XR
UX Designer, Thesis student
2022-2023

EMPATHYbuild is a master's thesis project completed as part of my MA in Expanded Realities at the Darmstadt University of Applied Sciences. The thesis was written in collaboration with the SAP Innovation Center, where I was employed under a thesis student contract at the time.

I led the thesis delivery from the research, design, and project management perspectives throughout the whole cycle: research, conception, prototyping, testing, and evaluation. For the prototype implementation, I worked in a sprint setting alongside two engineers from my team at SAP, who contributed to parts of the experience production in Unreal Engine, the interaction backend, and the speech recognition system.

About

The EMPATHYbuild project aimed to conceptualize, prototype, and test a VR experience that simulates a conversation between the user, who takes on the role of a designer or developer, and a User Persona. The User Persona is transformed from a textual 2D format into a full-fledged 3D MetaHuman character. The research project revolves around the central research question:

"To what extent can the Metaverse and Virtual Reality facilitate the creation of user personas, elevate empathy, and ultimately improve user-centered design?"

3D MetaHuman character used in the EMPATHYbuild VR experience

This project, while initially inspired by popular culture and prior experiences in the VR/MR medium, is also heavily grounded in relevant academic research. With the primary goal of enhancing empathy during design processes, the experience aims to address a key problem:

Impersonality of user personas.

[...] We found that many design and UX practitioners in our study used them almost exclusively for communication and not design. These practitioners found personas too abstract and impersonal to use for design.

Matthews et al., p.1125

Research

EMPATHYbuild is built upon four research pillars, most of them grounded in academic papers on the topics below. At the same time, it adopts UX and market research methods to establish a design-first approach to delivering value through the experience.

I. User-Centered Design and User Personas

Academic

II. Virtual Reality and the Metaverse

Academic

III. Virtual Reality and Empathy

Academic

IV. EMPATHYbuild Project-specific

Media / UX
Research framework showing inspiration sources, toolstack, and UX components for the EMPATHYbuild project
πŸ‘€

SAP Customer Experience Map

The persona the user converses with in the experience is based on an extensive global research study done at SAP, which maps SAP's main business customers as personas:

Customer Experience Map structures all main customer tasks and information needs along their journey with SAP. It is based on extensive, global user research and it truly reflects the customer perspective. This ensures that we stay focused on what our customers need and shape their experience with SAP accordingly.

Dorothy IT Lead persona profile showing role, responsibilities, disposition, tech profile, drivers, frustrations, involvement in customer lifecycle, agenda, and quotes
πŸ“

Defining use cases β†’

After a small design thinking workshop involving the experience's main users and stakeholders, I defined three scenarios addressed by EMPATHYbuild.

UX/Product Designer

To overcome problems of abstraction and impersonality of current user persona formats.

Product Manager

To get a clearer insight into the user needs for the sake of better product development aligned with UCD.

Software Developer

To get a clearer insight into the importance of solving specific problems within product implementation.

Prototyping

Creating the VR experience required an interdisciplinary design and production approach:

1. Dorothy, a user persona from SAP's Customer Experience Map, was put into a screenwritten format to create a unifying narrative across the different scenes, with a focus on character development.

2. The experience environment (scene) was designed around who Dorothy is.

3. The LiveLink Face app was used to create content and give Dorothy her voice and facial gestures.

4. Using Unreal Engine 5 and the necessary SDKs, the scene was adapted to work in virtual reality.

5. Interaction and UI design: defining the necessary interactions and onboarding elements, and implementing speech as the main interaction pattern.

I used Unreal Engine 5's MetaHuman framework to create Dorothy as a character in the experience, and assembled a suitable office scene with additional 3D models.

Adapting the user persona's text-based information into a screenwritten, interactive, story-driven character happened as part of a brainstorming process around an overarching story that Dorothy is part of. Drawing on my previous experience in film and theater, I developed a script that could later be discussed and refined with the voice actor who would bring Dorothy to life.

MetaHuman character creation process showing different views of Dorothy with detailed facial features, expressions, and full-body modeling
3D rendered office environment for Dorothy character showing different angles of a modern office with desks, chairs, and shelving
✏️

Storyboard/interaction flow + wireframes

To visualize the experience and plan the interactions, I created storyboards and wireframes that mapped out the user journey through the VR environment and their interactions with Dorothy. This helped ensure a coherent narrative flow and identify potential usability issues before development.

Storyboard showing intro screen with Dorothy persona in idle state, explaining navigation elements and interaction triggers
Storyboard showing scenes 3 and 4 with Dorothy in her office environment, explaining user interactions and experience flow
Storyboard showing scenes 5 and 6 with monologue sequences, question prompts, and speech recognition interactions
πŸ§ͺ

UE5 LiveLink for content production

The face animations and audio that give Dorothy her voice and personality were produced with the LiveLink Face app, which was created specifically to capture facial animation for use in Unreal projects, either as recordings or in real time. For this prototype, we focused on the app's recording feature.

Alongside the facial animation data, which is stored in CSV format, the app also generates a video file such as an MP4, which can then be converted with a third-party tool into WAV (an audio format supported by Unreal Engine) and used as the voice track.
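As an illustration of that conversion step, the snippet below strips the video stream from a LiveLink Face recording and re-encodes the audio as WAV. It is a minimal sketch assuming ffmpeg is installed and on the PATH; the file names are placeholders, not the project's actual assets.

```python
# Minimal sketch: extract the audio track from a LiveLink Face recording
# and convert it to a WAV file that Unreal Engine can import.
# Assumes ffmpeg is installed and on PATH; file names are placeholders.
import subprocess
from pathlib import Path

def mp4_to_wav(mp4_path: Path, wav_path: Path) -> None:
    """Drop the video stream and re-encode the audio as 16-bit PCM WAV."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", str(mp4_path),     # LiveLink Face recording
            "-vn",                   # discard the video stream
            "-acodec", "pcm_s16le",  # uncompressed PCM, importable by UE
            "-ar", "44100",          # 44.1 kHz sample rate
            str(wav_path),
        ],
        check=True,
    )

mp4_to_wav(Path("dorothy_take_01.mp4"), Path("dorothy_take_01.wav"))
```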

Face animations and voice were recorded by my friend and actress, Jess.

Side-by-side view of LiveLink Face App capturing facial expressions on iPhone (left) and the Unreal Engine interface showing the Dorothy MetaHuman character with applied facial animations (right)
🚧

Interaction backend

The skeleton of the whole prototype is implemented with Unreal's Blueprint visual scripting. Implementing the interaction flow began with a data table containing the questions the user can ask Dorothy, each paired with the sequence (Dorothy's answer, derived from the script) that the matching question triggers; a simplified text analog of that table is sketched below.
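For readers unfamiliar with Blueprint data tables, this is a rough Python analog of that structure; the actual prototype is pure Blueprint, and the row names, questions, and asset paths here are illustrative only.

```python
# Rough Python analog of the Blueprint data table that drives the
# conversation: each row pairs a question the user can ask with the
# sequence (Dorothy's recorded answer) it triggers.
# Row names, questions, and asset paths are illustrative placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DialogueRow:
    question: str   # question shown in the UI / matched by the NLU
    sequence: str   # sequence asset holding Dorothy's answer

DIALOGUE_TABLE: dict[str, DialogueRow] = {
    "daily-tasks": DialogueRow(
        question="What does your typical workday look like?",
        sequence="/Game/Sequences/Dorothy_DailyTasks",
    ),
    "pain-points": DialogueRow(
        question="What frustrates you most about your current tools?",
        sequence="/Game/Sequences/Dorothy_PainPoints",
    ),
}

def sequence_for(intent: str) -> Optional[str]:
    """Return the answer sequence for a recognized intent, if any."""
    row = DIALOGUE_TABLE.get(intent)
    return row.sequence if row else None
```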

Unreal Engine Blueprint system showing data table with conversation questions and visual scripting nodes for sequence playback and event handling
UI implementation showing Figma design (top) with widget components and the implemented VR interface (bottom) displaying the onboarding screen in the 3D environment
πŸ’…πŸ»

User interface

The UI was implemented with the Advanced VR Framework as in-scene 2D screens with interactive buttons, initially designed in Figma. Advanced VR Framework is a plugin offering a variety of functionalities, one of which is a simplified way to build interactive user interfaces for VR.

πŸ—£

Speech recognition system

The diagram below illustrates the speech recognition process. A button push on the controller triggers voice recording; the audio is captured in real time as a file and passed through a chain of services, including Google's speech-to-text service, which transcribes it. The transcript is then sent to SAP Conversational AI, which matches it against the set of questions the user can read from the UI and triggers the linked response from Dorothy (the user persona), who replies to the user accordingly.

Speech recognition system diagram showing the flow between Unreal Engine, Python server, Google Cloud speech-to-text, and SAP Conversational AI
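A condensed sketch of the relay server's core steps, assuming the google-cloud-speech and requests packages. The SAP Conversational AI endpoint, token, and response shape shown here are assumptions based on the service's public REST API, not the project's exact code.

```python
# Hedged sketch of the Python relay between Unreal Engine, Google Cloud
# Speech-to-Text, and SAP Conversational AI. The SAP endpoint, token,
# and response shape are assumptions, not the project's implementation.
import requests
from google.cloud import speech

def transcribe(audio_bytes: bytes) -> str:
    """Transcribe the recorded audio via Google Cloud Speech-to-Text."""
    client = speech.SpeechClient()
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,
        language_code="en-US",
    )
    audio = speech.RecognitionAudio(content=audio_bytes)
    response = client.recognize(config=config, audio=audio)
    return response.results[0].alternatives[0].transcript

def match_intent(transcript: str) -> str:
    """Forward the transcript to SAP Conversational AI for intent matching."""
    resp = requests.post(
        "https://api.cai.tools.sap/build/v1/dialog",  # assumed endpoint
        headers={"Authorization": "Token <REQUEST_TOKEN>"},  # placeholder
        json={
            "message": {"type": "text", "content": transcript},
            "conversation_id": "empathybuild-demo",
        },
        timeout=10,
    )
    resp.raise_for_status()
    # The matched intent is sent back to Unreal Engine, which plays the
    # answer sequence linked to it in the data table.
    return resp.json()["results"]["nlp"]["intents"][0]["slug"]
```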
πŸ“½οΈ

Experience trailer/Prototype demo

Testing

To evaluate the effectiveness of the EMPATHYbuild VR experience in enhancing empathy and improving user-centered design processes, a structured testing approach was developed. The testing plan focused on three key areas: conversation facilitation, VR as a platform, and empathy/relatability.

Testing plan diagram showing hypothesis, goals, testing areas (conversation facilitation, VR as platform, empathy/relatability), structure with 4 test subjects, and prototype technical details

The image outlines the initial plan for user testing.

πŸ”

Test insights

I. Facilitating Conversation

Cons

  • Problems with face/body animations and lip sync (unnatural/unrealistic) - 4/4
  • No freedom to interrupt (unnatural) - 2/4

Pros

  • Time between question asked and reply (realistic, "felt like she was thinking") - 3/4
  • Tone of voice in the conversation context (realistic) - 4/4
  • Sometimes face animations were on point (realistic) - 2/4

II. Virtual Reality as a Platform

Cons

  • Couldn't stay focused on Dorothy all the time (curiosity about the surroundings) - 2/4
  • Uncanny effect while talking to Dorothy - 4/4

Pros

  • Felt comfortable speaking in VR ("earlier experience having effect") - 3/4
  • Speech interaction felt natural and intuitive ("given the immersion") - 3/4
  • Speech interaction great alternative to current interactions in VR - 2/4

III. Empathy/Relatability

Cons

  • "Instead of listening, better to observe [Dorothy] performing certain actions → deduce pain points"

Pros

  • Relate Dorothy to someone they know - 3/4
  • Relate to Dorothy directly - 3/4
  • Give clear opinion of who Dorothy is and her personality - 4/4
  • Understood Dorothy's pain points - 4/4
πŸ”¬

Impact vs effort

Impact vs Effort matrix showing 2D text-based user personas (low impact, low effort), current EMPATHYbuild prototype (high impact, high effort), and future EMPATHYbuild-alike experiences (high impact, medium effort) with a 'Sweet Spot' area

Based on the data synthesis, I crafted an impact vs. effort matrix comparing current user-persona creation tools, the current EMPATHYbuild version, and a prediction for future EMPATHYbuild-like experiences.

πŸ“‘ Future steps

Active vs. passive engagement
Facilitating a different learning experience where, instead of passively listening, the user is asked to be more observant and come up with their own deductions about the persona and their pain points.

Workshop-style creation
Designers and product team members could come closer to the user persona they are creating, similarly to how actors often build deeper connections with the roles they play.

Large Language Models/AI
Giving the model parameters that would define its output, while allowing users to ask questions at will, without the limitations of predefined questions with attached replies (a minimal sketch follows below).
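As a rough illustration of that last direction, the sketch below constrains a chat model with a persona profile while leaving the user's questions free-form. The SDK choice, model name, and profile text are illustrative stand-ins, not part of the project.

```python
# Hedged sketch of an LLM-driven persona: a system prompt encodes the
# persona's parameters, while the user's questions stay free-form.
# SDK choice, model name, and profile text are illustrative stand-ins.
from openai import OpenAI

PERSONA_PROFILE = """You are Dorothy, an IT Lead persona.
Stay in character and answer only from the perspective of your role,
responsibilities, drivers, and frustrations."""

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_dorothy(question: str) -> str:
    """Answer a free-form question, constrained by the persona prompt."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": PERSONA_PROFILE},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_dorothy("What slows you down most during a software rollout?"))
```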
πŸ“ berlin