House of Desktop Dining

OVERVIEW
The House of Desktop Dining is a digital restaurant you visit to eat with your computer. Automatically generating a table for two, the restaurant uses selected visual and audio data from top mukbang performers to prepare a set of food pairings personalized to your body’s posture and mouth sounds.
How will this service affect users’ eating behavior and emotions? This project is a reflection and speculation made as the pandemic and social distancing began.


YEAR
May - Aug 2020

TEAM AND ROLE
Research Project Sponsored by Snapchat
Team Lead: Jenny Rodenhouse
My Contribution: UX, Machine Vision, Website Prototyping

IMPACT
0 - 1 Interactive Web Prototype
Live Exhibition in DDDD (July-Aug 2020)

Core Experiences



I. Restaurant Entrance 

Introduction to Machine as Dining Companion  


The entrance of the digital restaurant is a stylized video that establishes a traditional Korean mukbang atmosphere in connection with machine vision. It introduces users to the machine as their dining companion.



II. Eating with Computer 

A Working Prototype


The digital dining area features an interface for real-time gesture and sound recognition. With models trained on eating gestures and food sounds, the dining companion, the computer, rates users on how well they eat and drink according to the standards of top mukbang videos.
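A minimal sketch of what this real-time loop could look like in the browser, assuming the trained eating-gesture classifier were exported as a TensorFlow.js model at a hypothetical ./models/gesture/model.json path with an assumed label set; the actual prototype’s code, model format, and scoring rules may differ, and a parallel audio classifier (not shown) would score the mouth sounds.

```ts
// Hedged sketch of the dining loop: classify webcam frames and accumulate
// a "mukbang rating". Model path, labels, and element IDs are assumptions.
import * as tf from '@tensorflow/tfjs';

const GESTURE_LABELS = ['idle', 'bite', 'chew', 'sip']; // assumed classes
const INPUT_SIZE = 224;                                 // assumed model input size

async function startDining(): Promise<void> {
  const video = document.querySelector<HTMLVideoElement>('#diner-cam')!;
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  // Hypothetical export of the trained eating-gesture model.
  const gestureModel = await tf.loadGraphModel('./models/gesture/model.json');

  let rating = 0;
  setInterval(() => {
    tf.tidy(() => {
      const frame = tf.browser.fromPixels(video)
        .resizeBilinear([INPUT_SIZE, INPUT_SIZE])
        .div(255)
        .expandDims(0);
      const probs = (gestureModel.predict(frame) as tf.Tensor).dataSync();
      const best = probs.indexOf(Math.max(...Array.from(probs)));
      // Reward frames that match the "top mukbang" eating gestures.
      if (GESTURE_LABELS[best] === 'bite' || GESTURE_LABELS[best] === 'chew') {
        rating += probs[best];
      }
      document.querySelector('#rating')!.textContent = rating.toFixed(1);
    });
  }, 500);
}

startDining();
```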


III. Restaurant Exit

Unboxing Digital Food Production


Users leave the restaurant with a view into the visual process of machine learning applied to food. It unboxes “how the machine thinks” and tries to build understanding between the human and the machine.

Research and Prototype



RESEARCH QUESTION
“MACHINES, LIKE PHONES AND COMPUTERS, PROVIDE INCREASING EMOTIONAL SUPPORT AND VALUE TO THEIR USERS. HOW DOES MACHINE VISION, AS A NEW FORM OF MEDIUM, CHANGE USERS’ EATING HABITS?”


MUKBANG OBSERVATIONS

Does mukbang help people fight loneliness, and why?

After watching top-rated YouTube mukbang videos, we found a compositional trend: the performer’s head and the food dominate the frame. Also, similar to the triggering effect of ASMR, mukbang videos typically use trigger sounds to convey the foods’ textures.







DATA COLLECTION AND TRAINING

How can a machine perceive this kind of activity and help it become a norm?

Machines cannot actually taste, look, or listen; to them the activity is purely pixels and code. To train a machine to act as a companion, we collected mukbang visual and audio data and used it to train an algorithm that eats with a human.
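As an illustration only, here is a sketch of how collected clips could be turned into labeled training frames, assuming ffmpeg is available and the clips are already sorted into per-gesture folders; the folder layout and label names are hypothetical, and the project’s own data preparation (e.g. directly in Runway ML) may have worked differently.

```ts
// Hedged sketch: turn label-sorted mukbang clips into still frames for
// image training. Folder layout and labels are assumptions.
import { execFileSync } from 'node:child_process';
import { mkdirSync, readdirSync } from 'node:fs';
import { join, parse } from 'node:path';

const LABELS = ['bite', 'chew', 'sip']; // assumed gesture classes

for (const label of LABELS) {
  const outDir = join('dataset', label);
  mkdirSync(outDir, { recursive: true });

  for (const clip of readdirSync(join('clips', label))) {
    // Sample one frame per second from each clip as a training image.
    execFileSync('ffmpeg', [
      '-i', join('clips', label, clip),
      '-vf', 'fps=1',
      join(outDir, `${parse(clip).name}-%04d.png`),
    ]);
  }
}
```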







IMAGE TRAINING IN RUNWAY ML



FRONT-END INTEGRATION AND TESTING

Integrating Traditional Korean Restaurant Design with Machine Vision
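As a sketch of how the recognition output could be composited with the restaurant artwork on the front end, the snippet below assumes a canvas scene, a background image element, and the rating value produced by the dining loop above; the element IDs and layout values are placeholders, not the project’s actual implementation.

```ts
// Hedged sketch: draw the restaurant scene, seat the diner's webcam feed
// at the table, and overlay the companion's rating. IDs are assumptions.
function drawRestaurant(
  ctx: CanvasRenderingContext2D,
  diner: HTMLVideoElement,
  rating: number,
): void {
  const room = document.querySelector<HTMLImageElement>('#restaurant-bg')!;
  ctx.drawImage(room, 0, 0, ctx.canvas.width, ctx.canvas.height);

  // Place the live webcam feed "across the table" from the machine.
  ctx.drawImage(diner, ctx.canvas.width * 0.3, ctx.canvas.height * 0.35, 320, 240);

  // Companion feedback, rendered in the restaurant's visual style.
  ctx.font = '24px serif';
  ctx.fillStyle = '#ffffff';
  ctx.fillText(`Mukbang rating: ${rating.toFixed(1)}`, 24, 48);
}
```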