Motion & Emotion: A robot that reads and mirrors your emotions
Project conducted for the Master Industrial Design at TU Eindhoven
Duration: 5 months
Designer: Hanna Loschacoff
Contribution: Conceptualization, Interaction Design, Ideation of form and experience/expressivity, Mechanical prototyping, Material explorations using digital fabrication, Narrative building connected to academic research in HCI, Exploratory Sketching, Integration of a Foundation Model (Computer Vision/AI), Software & Hardware development
Exhibited at: Maker Days (2025)
Company collaboration: Bureau Moeilijke Dingen


This project asks: “What if machines could sense and mirror our emotions?” Using a computer vision AI model, a shape-changing interface detects emotional shifts and mirrors them through movement. By making emotional interaction with technology physical, it reveals new ways for humans to connect with machines through shared emotional cues. Drawing on the concept of emotion contagion (absorbing each other’s emotions), it makes this invisible process visible, highlighting the mutual emotional effect between human and machine and questioning what it means to be emotionally seen.
Final design


Interaction Loop and Expressivity

Motion responses per emotion

Process
01
Methodology: Material Driven Design (Research through Design)
Starting point: Combining shape-changing interfaces with physical AI (Computer Vision)
Material explorations of shape-changing interfaces




Training my own image recognition AI model using Teachable Machine

Above: Training an image recognition model on the states of the shape-changing interface
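For context, a minimal sketch of how a Teachable Machine image model can be run against a webcam feed is shown below. The file names (keras_model.h5, labels.txt) follow Teachable Machine's default Keras export, and the camera index is an assumption; this is an illustration, not the exact code of the prototype.

```python
# Sketch: classify webcam frames with a Teachable Machine Keras export.
# File names and camera index are assumptions based on the default export.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("keras_model.h5", compile=False)
labels = [line.strip() for line in open("labels.txt")]

cap = cv2.VideoCapture(0)  # webcam pointed at the shape-changing interface
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Teachable Machine's Keras export expects 224x224 RGB, scaled to [-1, 1]
    img = cv2.resize(frame, (224, 224))
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB).astype(np.float32)
    img = (img / 127.5) - 1.0
    probs = model.predict(img[np.newaxis, ...], verbose=0)[0]
    state = labels[int(np.argmax(probs))]
    print(state, float(np.max(probs)))
```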
02
Mood board: inspiration for shape-changing interfaces


Selecting a material and interaction modality for the shape-changing interface and exploring its action possibilities
Mapping emotion expressions to movement types
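As an illustration of what such a mapping could look like (the emotion labels and movement parameters below are hypothetical, not the exact values used in the prototype):

```python
# Hypothetical emotion-to-movement mapping; labels and parameters are illustrative.
MOTION_MAP = {
    "happy":     {"speed": 0.9, "amplitude": 0.8, "pattern": "bounce"},
    "sad":       {"speed": 0.2, "amplitude": 0.3, "pattern": "droop"},
    "angry":     {"speed": 1.0, "amplitude": 1.0, "pattern": "jitter"},
    "surprised": {"speed": 0.7, "amplitude": 1.0, "pattern": "expand"},
    "neutral":   {"speed": 0.4, "amplitude": 0.5, "pattern": "breathe"},
}

def motion_for(emotion: str) -> dict:
    """Fall back to a calm 'breathing' motion for unknown labels."""
    return MOTION_MAP.get(emotion, MOTION_MAP["neutral"])
```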

03
Iteration 1 of the interactive prototype: senses the user's location and, depending on their distance, performs a specific movement (the closer, the more agitated)
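The idea behind this iteration, sketched in Python: the sensor and servo calls are placeholders for whatever hardware library the prototype actually uses, and the distance range is an assumed value.

```python
# Sketch of the distance-to-agitation behavior of iteration 1.
# read_distance_cm() and set_servo() are hypothetical placeholders.
import math
import time

MIN_CM, MAX_CM = 20.0, 150.0   # assumed interaction range

def agitation(distance_cm: float) -> float:
    """Map distance to agitation in [0, 1]: the closer, the more agitated."""
    d = min(max(distance_cm, MIN_CM), MAX_CM)
    return 1.0 - (d - MIN_CM) / (MAX_CM - MIN_CM)

def run(read_distance_cm, set_servo):
    t = 0.0
    while True:
        a = agitation(read_distance_cm())
        # agitation scales both the oscillation speed and its amplitude
        angle = 90 + 60 * a * math.sin(t * (1 + 6 * a))
        set_servo(angle)
        t += 0.05
        time.sleep(0.05)
```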


04
Iteration 2 of the interactive prototype: reads emotions using the AI model (run on a virtual machine) and responds with a corresponding movement that mirrors the detected emotion
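A rough sketch of this mirroring loop, assuming the model on the virtual machine exposes its latest prediction over a simple HTTP endpoint; the URL and the play_motion() call are hypothetical placeholders rather than the prototype's actual interface.

```python
# Sketch: poll the emotion model on the virtual machine and mirror the result.
# EMOTION_URL and play_motion() are assumptions for illustration only.
import time
import requests

EMOTION_URL = "http://192.168.0.42:5000/emotion"  # assumed VM address

def latest_emotion() -> str:
    try:
        return requests.get(EMOTION_URL, timeout=0.5).json().get("label", "neutral")
    except requests.RequestException:
        return "neutral"  # fail safe: fall back to a calm motion

def mirror_loop(play_motion):
    while True:
        play_motion(latest_emotion())  # look up and play the mirroring movement
        time.sleep(0.2)
```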


05
Iteration 3 of the interactive prototype: building sturdier mechanisms to increase the tension of the strings that pull the shape-changing material, so the movement becomes clearly visible. A current sensor senses the state/position of the prototype so the mechanism does not overextend and damage the material.
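The safety logic can be sketched as follows; the current threshold and the motor/sensor functions are assumed placeholders, since the real values depend on the motor driver and material used.

```python
# Sketch: motor current rises as string tension increases, so exceeding a
# threshold is treated as "the material is at its limit" and the motor backs off.
# read_current_amps() and the motor functions are hypothetical placeholders.
import time

CURRENT_LIMIT_A = 0.8   # assumed overload threshold, tuned per prototype

def tension_step(read_current_amps, motor_forward, motor_reverse, motor_stop):
    """Advance the tensioning motor one step, but never past the current limit."""
    if read_current_amps() > CURRENT_LIMIT_A:
        motor_stop()
        motor_reverse(0.1)   # release a little tension
        return False         # limit reached, stop pulling
    motor_forward(0.05)
    return True

def pull_until_limit(read_current_amps, motor_forward, motor_reverse, motor_stop):
    while tension_step(read_current_amps, motor_forward, motor_reverse, motor_stop):
        time.sleep(0.01)
```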



Part 2: Master Thesis
Currently working on my Master Thesis, expanding on this project by using gaze direction detection and radar sensors to read more non-verbal cues.
In short: Everybody has a different "lens" through which they read social cues (specifically non-verbal cues). I want to show how we often misinterpret other people's cues and how this impacts social interaction, by stimulating people to reflect on their cognitive biases and their "lens".
I will do so through an installation consisting of an interactive robot and a data visualization on a monitor.
Robot: Through interaction with the robot, the user's behavior will be analyzed and the robot will easily feel rejected (expressing this through movement). To understand why the robot behaves this way, the visitor can visit the data visualization area.
Data visualization: The visualization reveals the robot's thought process/lens in reading the social cues of the person it's interacting with and showcases the sensor data, sparking reflection on how we interpret rejection and make assumptions about others. It uses technology meant to track us, reimagined for self-understanding.
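Purely as an illustration of the "lens" idea (none of these cue names, weights, or thresholds come from the actual installation), the robot's biased reading of non-verbal cues could be sketched like this:

```python
# Illustrative sketch: the robot reads non-verbal cues through a deliberately
# biased "lens" and accumulates a rejection score that drives its movement and
# is logged for the data visualization. All names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class Lens:
    gaze_aversion_weight: float = 0.7   # the robot over-weights averted gaze
    distance_weight: float = 0.3

def rejection_score(lens: Lens, gaze_on_robot: bool, distance_m: float) -> float:
    """Return a score in [0, 1]; higher means the robot 'feels' more rejected."""
    score = 0.0
    if not gaze_on_robot:
        score += lens.gaze_aversion_weight
    if distance_m > 1.5:                # stepping back reads as rejection
        score += lens.distance_weight
    return min(score, 1.0)
```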
A small sneak peek of the first draft of the data visualization