InMyFeels: MIT Reality Hackathon

Art direction for a virtual reality safe space that enables empathetic self-expression

My Role

Experience Designer

My Scope

Responsible for the project end-to-end: research, information architecture, prototyping, and testing.

Results

Semi-finalist

The Team

1 UX Designer (me)
1 Sound Designer
3 Software Engineers

Context
THE HACKATHON

MIT Reality Hack

MIT Reality Hack is a community-run XR hackathon held at MIT where teams form, design, and pitch a fully functional immersive reality product in 2.5 days.

Semi-finalists out of 74 participating teams
THE ROLE

Experience Designer

As the sole experience designer, I wireframed and prototyped the experience in Figma. I designed the user interfaces and the social network experience, and I visualized and explained the experience by producing a 3-minute video.
THE TEAM

Cross-Functional

While at the hackathon, our team of 1 experience designer (me), 2 software developers, 1 technical artist, and 1 sound designer formed with the goal of creating a safe space to express emotions using sound.
What does it do?
THE EXPERIENCE

Analyze the emotion in your voice using machine learning

In My Feels is an emotional safe space where there is no right or wrong way to express emotions. After you sing any melody that resonates with you into an orb, In My Feels uses machine learning to analyze the tone of your voice and detect your emotion. When an emotion is detected, the orb changes color and floats up to the cloud, a collection of emotions from your community. You and your friends can upload multiple emotions daily, and a private log helps you reflect on your emotions over time.
Why did we make it?
OUR MISSION

Create a safe space for emotions

Growing up, we always wanted a safe space that would welcome our emotions, whether negative or positive. We came together to create In My Feels to be in our feels, for real. Inspired by the authenticity of BeReal and by research on the short-term effects of group singing versus listening on mood and state self-esteem, In My Feels is an authentic social media platform that encourages empathy and emotional support.
How did we build it?
USING ARTIFICIAL INTELLIGENCE

Analyze spectrograms for speech emotion recognition (SER)

The team used a convolutional neural network (CNN) to classify spectrograms generated from WAV files for SER. They used Uvicorn and Flask to create an API that calls the trained model to evaluate new WAV streams. The team built audio-reactive "blobs" in the environment using Shader Graph to control vertex shaders, normals, and fragment shaders, creating jelly-like, holographic, colorful orbs. Game state was coordinated through a large set of boolean flags in the GameManager script. I used Figma to prototype the user experience and visuals.
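As a rough illustration of how such a pipeline can fit together, here is a minimal sketch assuming librosa for log-mel spectrograms, a small PyTorch CNN, and a Flask route. The EmotionCNN architecture, the /predict endpoint, and the eight RAVDESS-style labels are illustrative assumptions, not the team's actual code.

```python
# Sketch of a SER inference API: WAV bytes in, emotion label out.
# Model architecture, label set, and route name are hypothetical.
import io

import librosa
import numpy as np
import torch
import torch.nn as nn
from flask import Flask, jsonify, request

# Assumed RAVDESS-style 8-class label set.
EMOTIONS = ["neutral", "calm", "happy", "sad",
            "angry", "fearful", "disgust", "surprised"]

class EmotionCNN(nn.Module):
    """Small CNN that classifies a log-mel spectrogram into 8 emotions."""
    def __init__(self, n_classes: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = EmotionCNN()
model.eval()  # the trained weights would be loaded here in practice

def wav_to_melspec(wav_bytes: bytes) -> torch.Tensor:
    """Decode WAV bytes into a (1, 1, mels, frames) log-mel tensor."""
    y, sr = librosa.load(io.BytesIO(wav_bytes), sr=16000)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64)
    log_mel = librosa.power_to_db(mel, ref=np.max)
    return torch.from_numpy(log_mel).float()[None, None]

app = Flask(__name__)

@app.post("/predict")
def predict():
    spec = wav_to_melspec(request.files["audio"].read())
    with torch.no_grad():
        probs = torch.softmax(model(spec), dim=1)[0]
    return jsonify({"emotion": EMOTIONS[int(probs.argmax())],
                    "confidence": float(probs.max())})
```

One practical note: Uvicorn is an ASGI server while Flask is a WSGI framework, so serving a Flask app through Uvicorn typically requires a WSGI-to-ASGI adapter in between.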
Design principles

Defining identity

We wanted our users to feel safe expressing their personal feelings. Because expressing one's emotions can feel vulnerable and trigger bodily sensations, we wanted to create an environment of physical and emotional control in In My Feels, similar to the poetic feeling of shouting out into the ocean and the essence of being in the deep sea. To explore this feeling and create this environment, I set three keywords that define In My Feels.
Design System

Creating look and feel

To create a cohesive look and feel, I drew inspiration from the concept of a "message in a bottle," the vast ocean, and digital echoes. For the orb, I looked into textures of glass, transparency, and enclosure. For the environment, I looked into ocean landscapes. For the data analytics and machine learning, I looked into data visualizations that draw connections using clustering.
Multisensory design

Attribute color and whale sounds to emotion

To make the experience more immersive and accessible to all abilities, we attributed both a color and a soundscape to each of the 8 emotions the CNN model can identify. We chose whale sounds for their beautiful voices and to create the feeling of being safe and protected underwater.
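As a simple illustration, that mapping can live in a single lookup table so the two senses always stay in sync. The hex colors and clip names below are hypothetical stand-ins, and the eight labels assume a RAVDESS-style SER label set.

```python
# Hypothetical emotion -> (orb color, whale-call clip) lookup.
# Hex values and clip names are illustrative, not the shipped palette.
EMOTION_THEMES = {
    "neutral":   ("#9AA7B0", "whale_neutral.wav"),
    "calm":      ("#7FD4C1", "whale_calm.wav"),
    "happy":     ("#FFD166", "whale_happy.wav"),
    "sad":       ("#5C7CFA", "whale_sad.wav"),
    "angry":     ("#EF476F", "whale_angry.wav"),
    "fearful":   ("#9B5DE5", "whale_fearful.wav"),
    "disgust":   ("#6A994E", "whale_disgust.wav"),
    "surprised": ("#F4A261", "whale_surprised.wav"),
}

def theme_for(emotion: str) -> tuple[str, str]:
    """Return the (color, soundscape) pair for a detected emotion,
    falling back to neutral for anything unrecognized."""
    return EMOTION_THEMES.get(emotion, EMOTION_THEMES["neutral"])
```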
Interaction design

Define states with spatial design and movement

We defined the states the experience needed and mapped them onto a linear flow to figure out which movement would fit each state.
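A minimal sketch of how that linear flow could be expressed, with hypothetical state names standing in for the ones we mapped:

```python
# Hypothetical linear state flow for one orb; names are illustrative.
from enum import Enum, auto

class OrbState(Enum):
    IDLE = auto()       # orb drifts, waiting for the user
    LISTENING = auto()  # user sings into the orb; orb pulses with the voice
    ANALYZING = auto()  # audio is sent to the SER model; orb shimmers
    REVEAL = auto()     # emotion detected; orb shifts to its emotion color
    RELEASE = auto()    # orb floats up and joins the community cloud

FLOW = [OrbState.IDLE, OrbState.LISTENING, OrbState.ANALYZING,
        OrbState.REVEAL, OrbState.RELEASE]

def next_state(current: OrbState) -> OrbState:
    """Advance one step along the linear flow, holding at the final state."""
    i = FLOW.index(current)
    return FLOW[min(i + 1, len(FLOW) - 1)]
```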
Demo
View the virtual reality environment below.
Figma Slide Deck
Click through our slide deck in the embedded Figma preview below (or press the spacebar on your keyboard).