Through your eyes: A robot that mirrors how we interpret others through our own biased social lens
Project conducted for the Master course "Final Master Project"
Duration: 5 months
Designer: Hanna Loschacoff
Contribution:
- Conceptualization and interaction design
- Ideation of form and experience/expressivity
- Improving sturdiness of mechanics and AI model integration (reactivity)
- Ideating and prototyping actuation expressivities
- Gaze (computer vision) and radar sensing integration into the data pipeline
- C++ microcontroller software development
- Manufacturing of the aluminium stand
- User studies, expert interviews, and literature review
Meet a robot that reads social cues through radar and gaze tracking and reacts to perceived behavior. A live visualization reveals its reasoning, mirroring how cognitive biases shape interaction. The platform explores human-robot interaction from multiple perspectives, first-person and third-person, each yielding a different interpretation. Just as our views of others are subjective, there is no single objective truth. It invites reflection on how humans and machines construct meaning from ambiguous signals, and on the invisible lenses through which we judge ourselves and others.
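To illustrate the idea of interpreting ambiguous signals through a biased lens, here is a minimal sketch of how gaze and radar cues might be fused into a perceived-engagement score. All names, weights, and thresholds are hypothetical assumptions for illustration, not the project's actual pipeline.

```python
# Hypothetical sketch: fusing gaze and radar cues into a (biased)
# interpretation of engagement. Thresholds and weights are
# illustrative assumptions, not the project's real values.
from dataclasses import dataclass


@dataclass
class SocialCues:
    gaze_on_robot: bool   # e.g. from computer-vision gaze tracking
    distance_m: float     # e.g. from radar sensing


def interpret(cues: SocialCues, prior: float = 0.5) -> float:
    """Return a perceived-engagement score in [0, 1].

    `prior` plays the role of a cognitive bias: it pulls the
    interpretation toward the robot's expectation regardless of
    the evidence it actually observes.
    """
    evidence = 0.0
    if cues.gaze_on_robot:
        evidence += 0.5          # mutual gaze suggests engagement
    if cues.distance_m < 1.5:    # within assumed social distance
        evidence += 0.5
    # Blend observed evidence with the biased prior expectation.
    return 0.7 * evidence + 0.3 * prior


# Same cues, different priors: the interpretation shifts with the bias.
score = interpret(SocialCues(gaze_on_robot=True, distance_m=1.0), prior=0.2)
```

Running `interpret` with identical cues but different `prior` values shows how two observers can read the same behavior differently, which is the point the live visualization makes visible.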
Final design



Final interaction loop

Final stand

Process
Methodology: Research through Design
Making logs of integrating sensing (gaze and radar)
Designing live visualization

Ideating interaction loop


Interaction loop evolution
Designing and building the integrated aluminium stand

Form exploration