Motion & Emotion: A robot that reads and mirrors your emotions
Project conducted for the Master Industrial Design at TU Eindhoven
Duration: 5 months
Designer: Hanna Loschacoff
Contribution: Conceptualization, Interaction Design, Ideation of form and experience/expressivity, Mechanical prototyping, Material explorations using digital fabrication, Narrative building connected to academic research in HCI, Exploratory sketching, Integration of a foundation model (Computer Vision/AI), Software & Hardware development
Exhibited at: Maker Days (2025)
Company collaboration: Bureau Moeilijke Dingen


This project asks: "What if machines could sense and mirror our emotions?" Using a computer vision AI model, a shape-changing interface detects emotional shifts and mirrors them through movement. By making emotional interaction with technology physical, it reveals new ways for humans to connect with machines through shared emotional cues. Drawing on the concept of emotion contagion (people absorbing each other's emotions), it makes this invisible process visible, highlighting the mutual emotional effect between human and machine and questioning what it means to be emotionally seen.
Final design


Interaction Loop and Expressivity

Motion responses per emotion

Process
01
Methodology: Material Driven Design (Research through Design)
Starting point: Combining shape-changing interfaces with physical AI (Computer Vision)
Material explorations of shape-changing interfaces




Training my own image recognition AI model using Teachable Machine

Above: Training an image recognition model on the states of a shape-changing interface
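Teachable Machine exports a classifier that returns one probability per trained class. A minimal sketch of how its predictions could be post-processed before driving the prototype; the class labels and the confidence threshold here are illustrative assumptions, not the project's actual values:

```python
# Hypothetical post-processing of Teachable Machine class probabilities.
# Labels name the trained states of the shape-changing interface (assumed).
LABELS = ["flat", "curled", "fully_contracted"]

def classify(probabilities, threshold=0.6):
    """Return the most likely interface state, or None if confidence is too low."""
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    if probabilities[best] < threshold:
        return None  # too uncertain to act on; keep the current state
    return LABELS[best]
```

Thresholding keeps the prototype from twitching on ambiguous frames: below the cutoff the system simply holds its last state.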
02
Mood board: inspiration for shape-changing interfaces


Selecting a material and interaction modality for the shape-changing interface and exploring its action possibilities
Mapping emotion expressions to movement types

03
Iteration 1 of interactive prototype: senses the user's location and triggers a specific movement depending on distance (the closer the user, the more agitated the motion)
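The distance-to-agitation behavior of iteration 1 can be sketched as a clamped linear mapping; the sensing range limits below are assumed values, not measurements from the prototype:

```python
# Assumed sensing range of the distance sensor (illustrative values).
MIN_DIST_CM, MAX_DIST_CM = 20.0, 200.0

def agitation(distance_cm):
    """Map sensed distance to an agitation level in [0, 1]: closer = more agitated."""
    clamped = max(MIN_DIST_CM, min(MAX_DIST_CM, distance_cm))
    return (MAX_DIST_CM - clamped) / (MAX_DIST_CM - MIN_DIST_CM)
```

The agitation value could then scale movement speed or amplitude, so the interface grows visibly restless as the user approaches.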


04
Iteration 2 of interactive prototype: reads the user's emotion using the AI model (run on a virtual machine) and responds with a corresponding movement that mirrors it
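Iteration 2's sense-and-mirror behavior boils down to a loop: grab a frame, classify the emotion, play the matching movement. A minimal sketch, where `get_frame`, `detect`, and `play` stand in for the camera capture, the emotion model, and the actuator control (all three names are hypothetical):

```python
def mirror_loop(get_frame, detect, play, steps=10):
    """For each frame, detect the user's emotion and play the mirroring movement."""
    mirrored = []
    for _ in range(steps):
        emotion = detect(get_frame())   # e.g. "happy", "sad", or None if unsure
        if emotion is not None:
            play(emotion)               # drive the shape-changing material
            mirrored.append(emotion)
    return mirrored
```

Skipping `None` detections means the interface only moves when the model is confident, which keeps the mirroring legible.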


05
Iteration 3 of interactive prototype: building sturdier mechanisms to increase the tension of the strings that pull the shape-changing material, producing clearly visible movement. A current sensor monitors the prototype's state/position so the actuators do not overextend and damage the material.
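The current-sensor safeguard can be sketched as a simple cutoff: when motor current rises (the strings straining against the material), stop pulling. The milliamp threshold below is an illustrative value, not the prototype's calibrated limit:

```python
# Assumed safety threshold: above this, the strings are straining the material.
CURRENT_LIMIT_MA = 900

def safe_to_pull(current_ma, limit_ma=CURRENT_LIMIT_MA):
    """Return False when the motor should stop to protect the material."""
    return current_ma < limit_ma

def step_motor(current_ma):
    # In the real control loop this would command the actuator;
    # here it just reports the decision.
    return "pull" if safe_to_pull(current_ma) else "hold"
```

Because stalled motors draw more current, this doubles as an implicit position sensor: the limit is hit exactly when the material can deform no further.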


