
Motion & Emotion: A robot that reads and mirrors your emotions

Project conducted for the Master's programme in Industrial Design at TU Eindhoven

Duration: 5 months

Designer: Hanna Loschacoff

Contribution: conceptualization; interaction design; ideation of form, experience, and expressivity; mechanical prototyping; material explorations using digital fabrication; narrative building connected to academic research in HCI; exploratory sketching; integration of a foundation model (computer vision/AI); software and hardware development


Exhibited at: Maker Days (2025)
Company collaboration: Bureau Moeilijke Dingen


This project asks: "What if machines could sense and mirror our emotions?" Using a computer-vision AI model, a shape-changing interface detects emotional shifts and mirrors them through movement. By making emotional interaction with technology physical, it reveals new ways for humans to connect with machines through shared emotional cues. Drawing on the concept of emotion contagion (absorbing each other's emotions), it makes this invisible process visible, highlighting the mutual emotional effect between human and machine and questioning what it means to be emotionally seen.

Final design


Interaction Loop and Expressivity


Motion responses per emotion


Process

01

Methodology: Material Driven Design (Research through Design)
Starting point: combining shape-changing interfaces with physical AI (computer vision)

Material explorations of shape-changing interfaces

Training my own image-recognition AI model using Teachable Machine


Above: training an image-recognition model on the states of the shape-changing interface
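For reference, running live inference with a Teachable Machine export looks roughly like the sketch below. It is a minimal example, assuming the standard Keras export (keras_model.h5 plus labels.txt) and a webcam read through OpenCV; the printed class names correspond to the trained interface states.

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Files as produced by Teachable Machine's Keras export
model = load_model("keras_model.h5", compile=False)
class_names = [line.strip() for line in open("labels.txt")]  # e.g. "0 open"

cap = cv2.VideoCapture(0)  # default webcam
try:
    while True:  # stop with Ctrl+C
        ok, frame = cap.read()
        if not ok:
            break
        # The export expects 224x224 RGB input scaled to [-1, 1]
        img = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
        x = np.asarray(img, dtype=np.float32) / 127.5 - 1.0
        probs = model.predict(x[np.newaxis, ...], verbose=0)[0]
        print(class_names[int(np.argmax(probs))], float(probs.max()))
finally:
    cap.release()
```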

02

Mood board: inspiration for shape-changing interfaces


Selecting a material and interaction modality for the shape-changing interface and exploring its action possibilities

Mapping emotion expressions to movement types

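Such a mapping can be captured in a simple lookup table. The sketch below is illustrative only: the emotion labels, parameter names, and values are hypothetical stand-ins, not the tuning that came out of these explorations.

```python
from dataclasses import dataclass

@dataclass
class Motion:
    speed: float      # cycles per second
    amplitude: float  # 0.0 (subtle) to 1.0 (full travel)
    pattern: str      # qualitative movement character

# Hypothetical emotion-to-movement table for illustration
EMOTION_TO_MOTION = {
    "happy":     Motion(speed=1.5, amplitude=0.8, pattern="bounce"),
    "sad":       Motion(speed=0.3, amplitude=0.3, pattern="droop"),
    "angry":     Motion(speed=2.5, amplitude=1.0, pattern="jitter"),
    "surprised": Motion(speed=2.0, amplitude=0.9, pattern="snap_open"),
    "neutral":   Motion(speed=0.8, amplitude=0.5, pattern="breathe"),
}

def motion_for(emotion: str) -> Motion:
    # Fall back to calm idling for unrecognized labels
    return EMOTION_TO_MOTION.get(emotion, EMOTION_TO_MOTION["neutral"])
```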

03

Iteration 1 of the interactive prototype: senses the user's location and performs a specific movement depending on distance (the closer the user, the more agitated the movement)

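The distance-to-agitation behavior can be sketched as a linear ramp between two thresholds; the values below are hypothetical, and the actual sensor driver would supply the distance readings.

```python
def agitation_from_distance(distance_cm: float,
                            near_cm: float = 30.0,
                            far_cm: float = 200.0) -> float:
    """Map a distance reading to an agitation level in [0, 1].

    Thresholds are illustrative: at near_cm or closer the prototype
    is fully agitated (1.0); at far_cm or farther it stays calm (0.0).
    """
    clamped = max(near_cm, min(far_cm, distance_cm))
    return (far_cm - clamped) / (far_cm - near_cm)
```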

04

Iteration 2 of the interactive prototype: reads emotions using the AI model (run on a virtual machine) and responds with a corresponding movement that mirrors the detected emotion

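Conceptually this iteration is a sense-classify-actuate loop. The sketch below shows its shape; all three callables are placeholders for the project's actual camera capture, emotion classifier, and motor driver.

```python
import time

def mirror_loop(read_camera, classify_frame, actuate, period_s=0.5):
    """Continuously mirror the detected emotion.

    read_camera() returns a frame, classify_frame(frame) returns an
    emotion label, and actuate(label) plays the movement that mirrors
    it (per an emotion-to-motion table like the one sketched above).
    """
    while True:
        frame = read_camera()
        label = classify_frame(frame)  # e.g. "happy", "sad", "angry"
        actuate(label)                 # play the mirroring movement
        time.sleep(period_s)           # pace the loop for the motors
```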

05

Iteration 3 of the interactive prototype: building sturdier mechanisms that increase the tension of the strings pulling the shape-changing material, so the movement becomes clearly visible. A current sensor tracks the prototype's state/position so the mechanism stops before the material is overstrained and damaged.

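The protection logic amounts to a current threshold: motor current rises with string tension, so it doubles as a proxy for how far the material has been pulled. The limit and margin below are hypothetical, and the current readings would come from the actual sensor driver.

```python
def safe_to_tighten(current_a: float,
                    limit_a: float = 0.8,
                    margin: float = 0.9) -> bool:
    """Return True while the tensioning motor may keep pulling.

    Values are illustrative; the real cut-off depends on the motor
    and the shape-changing material. The control loop stops (or
    reverses) the motor as soon as this returns False.
    """
    return current_a < limit_a * margin
```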

Part 2: Master Thesis

I am currently working on my Master's thesis, expanding on this project by using gaze-direction detection and radar sensors to read more non-verbal cues.

In short: everybody has a different "lens" through which they read social cues, especially non-verbal ones. I want to show how often we misinterpret other people's cues and how that impacts social interaction, by prompting people to reflect on their cognitive biases and their "lens".
I will do so through an installation consisting of an interactive robot and a data visualization on a monitor.
Robot: as the user interacts with the robot, their behavior is analyzed, and the robot easily feels rejected (expressing this through movement). To understand why the robot is behaving this way, the visitor can move to the data visualization area.
Data visualization: the visualization reveals the robot's thought process (its "lens") for reading the social cues of the person it is interacting with, and showcases the underlying sensor data, sparking reflection on how we interpret rejection and make assumptions about others. Technology meant to track us is reimagined for self-understanding.
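As a toy illustration of such a biased "lens" (the thesis sensing and weighting are still in development, so every cue, weight, and threshold here is hypothetical):

```python
def rejection_score(gaze_away: bool, distance_m: float,
                    bias: float = 0.3) -> float:
    """Toy model of the robot's over-sensitive reading of rejection.

    The bias term makes the robot over-read neutral behavior as
    rejection, which is exactly what the data visualization would
    later expose to the visitor.
    """
    score = bias
    if gaze_away:
        score += 0.4  # averted gaze read as disinterest
    if distance_m > 1.5:
        score += 0.3  # keeping distance read as avoidance
    return min(score, 1.0)
```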

 

A small sneak peek at the first draft of the data visualization
