
Through your eyes: A robot that mirrors how we interpret others through our own biased social lens

Project conducted for the Master's course "Final Master Project"

Duration: 5 months

Designer: Hanna Loschacoff

Contribution: Conceptualization, Interaction Design, Ideation of form and experience/expressivity, Improving sturdiness of mechanics and AI model integration (reactivity), Ideating and prototyping actuation expressivities, Gaze (computer vision) and radar sensing integration into data pipeline, C++ microcontroller software development, Manufacturing of aluminium stand, User studies, Expert interviews, Literature review

Meet a robot that reads social cues through radar and gaze tracking and reacts to the behavior it perceives. A live visualization reveals its reasoning, mirroring how cognitive biases shape interaction. The platform explores human-robot interaction from multiple perspectives, first-person and third-person, each yielding a different interpretation. Like our subjective views of others, there is no single objective truth. It invites reflection on how humans and machines construct meaning from ambiguous signals, and on the invisible lenses through which we judge ourselves and others.
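The sense-and-interpret loop described above can be sketched in C++ (the language used for the project's microcontroller software). This is a minimal illustrative sketch, not the project's actual code: the type and function names (`GazeSample`, `RadarSample`, `classifyCue`) and the thresholds are all assumptions chosen to show how ambiguous sensor readings collapse into one discrete social label, the "biased lens" the project visualizes.

```cpp
#include <string>

// Hypothetical sensor readings; names and fields are illustrative only.
struct GazeSample  { bool  faceVisible;  float gazeOnRobot; }; // gazeOnRobot in [0, 1]
struct RadarSample { float distanceM; };                       // distance in metres

// The "biased lens": continuous, ambiguous signals are forced into
// a single discrete interpretation, discarding the uncertainty.
std::string classifyCue(const GazeSample& g, const RadarSample& r) {
    if (!g.faceVisible)                              return "absent";
    if (r.distanceM < 1.0f && g.gazeOnRobot > 0.6f)  return "engaged";
    if (r.distanceM < 1.0f)                          return "intruding"; // close, not looking
    if (g.gazeOnRobot > 0.6f)                        return "observing"; // looking from afar
    return "passing";
}
```

The same person standing close while glancing away is labeled "intruding" rather than, say, "shy": the thresholds encode the designer's bias, which is exactly what the live visualization makes visible.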

 

Click here for the GitHub repository

Final design


Final interaction loop


Final stand


Process

Methodology: Research through Design

Making logs: integrating gaze and radar sensing

Designing live visualization


Ideating interaction loop


Interaction loop evolution

Designing and building the integrated aluminium stand


Form exploration
