
Through your eyes
A robot that mirrors how we interpret others through our own biased social lens

Meet a robot that reads social cues through radar and gaze tracking and reacts to the behavior it perceives. A live visualization reveals its reasoning, mirroring how cognitive biases shape interaction. The platform lets human-robot interaction be explored from multiple perspectives, first-person and third-person, each carrying a different interpretation. As with our subjective views of others, there is no single objective truth. It invites reflection on how humans and machines construct meaning from ambiguous signals, and on the invisible lenses through which we judge ourselves and others.


Click here for the GitHub repository

Project conducted for the Master's thesis

Duration: 5 months

Designer: Hanna Loschacoff

Technology: 

  • ESP32 microcontroller + 24 GHz radar sensor (serial communication)

  • Servo motor control (physical shape-changing response)

  • Node.js WebSocket server (real-time browser ↔ hardware communication; see the bridge sketch after this list)

  • WebGazer.js (open-source browser-based gaze tracking)

  • HTML + JavaScript (React Framework, data processing & real-time visualization)

  • Local hosting via Live Server (VS Code)

  • Privacy: fully local, ephemeral, no data persistence
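
The exact wiring isn't documented on this page, but a minimal sketch of the serial-to-WebSocket bridge could look like this, assuming the ESP32 streams newline-delimited radar readings at 115200 baud and the browser sends servo angles back as JSON. The port path, baud rate, and message shapes are illustrative assumptions, not the project's actual protocol (see the GitHub repository for that).

```javascript
// bridge.js: sketch of the Node.js relay between the ESP32 and the browser.
// Assumes `npm install serialport ws`. The port path, baud rate, and
// message formats are illustrative assumptions, not the real protocol.
const { SerialPort } = require('serialport');
const { ReadlineParser } = require('@serialport/parser-readline');
const { WebSocketServer, WebSocket } = require('ws');

const port = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 115200 });
const parser = port.pipe(new ReadlineParser({ delimiter: '\n' }));
const wss = new WebSocketServer({ port: 8080 });

// Relay each radar line from the ESP32 to every connected browser.
parser.on('data', (line) => {
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) {
      client.send(JSON.stringify({ type: 'radar', value: line.trim() }));
    }
  }
});

// Forward servo commands from the browser back over serial, driving the
// physical shape-changing response.
wss.on('connection', (ws) => {
  ws.on('message', (raw) => {
    const msg = JSON.parse(raw);
    if (msg.type === 'servo') port.write(`servo:${msg.angle}\n`);
  });
});
```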

Final design


Final interaction loop


Final stand


Data pipeline
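
In outline: radar readings flow from the ESP32 over serial into Node.js and on to the browser via WebSocket, while gaze estimates come from WebGazer.js directly in the browser. A minimal sketch of the browser end, using the same assumed URL and message shapes as the bridge sketch above (`webgazer` is the global exposed by the WebGazer.js script):

```javascript
// pipeline.js: sketch of the browser end of the pipeline.
// `webgazer` is the global exposed by the WebGazer.js script tag; the URL
// and message shapes follow the bridge sketch above (assumptions).
const socket = new WebSocket('ws://localhost:8080');

// Stream gaze estimates (screen coordinates) to the server as they arrive.
webgazer.setGazeListener((data, elapsedMs) => {
  if (data && socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify({ type: 'gaze', x: data.x, y: data.y, t: elapsedMs }));
  }
}).begin();

// Consume radar readings relayed from the ESP32; logging stands in for the
// live visualization update.
socket.addEventListener('message', (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'radar') console.log('radar:', msg.value);
});
```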

Process

Methodology: Research through Design

Logging the integration of sensing (gaze and radar)
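
One plausible shape for such integration logs: fuse the most recent radar reading into each gaze sample, keeping records only in an in-memory array so nothing persists, in line with the privacy note above. The field names are illustrative assumptions.

```javascript
// logs.js: sketch of fusing gaze and radar into timestamped records.
// Records live only in an in-memory array (ephemeral, no persistence);
// the field names are illustrative assumptions.
const records = [];
let latestRadar = null;

// Called with each radar message relayed from the ESP32.
function onRadar(value) {
  latestRadar = value;
}

// Called with each WebGazer gaze estimate (screen coordinates).
function onGaze(x, y) {
  records.push({
    t: performance.now(),  // ms since page load
    gaze: { x, y },
    radar: latestRadar,    // most recent radar reading, if any
  });
}
```

Wiring onRadar and onGaze into the WebSocket and WebGazer callbacks from the pipeline sketch above closes the loop.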

Designing live visualization
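
The technology list names React for the real-time visualization; a minimal sketch of a component that re-renders on each relayed radar message could look like this. The `socket` prop follows the sketches above, and the rendering is a placeholder.

```javascript
// LiveView.jsx: sketch of a React component for the live visualization.
// The `socket` prop is the WebSocket from the pipeline sketch above.
import { useEffect, useState } from 'react';

export function LiveView({ socket }) {
  const [radar, setRadar] = useState(null);

  useEffect(() => {
    const onMessage = (event) => {
      const msg = JSON.parse(event.data);
      if (msg.type === 'radar') setRadar(msg.value);
    };
    socket.addEventListener('message', onMessage);
    return () => socket.removeEventListener('message', onMessage);
  }, [socket]);

  // Placeholder rendering; the real visualization exposes the robot's
  // reasoning rather than a raw reading.
  return <p>Radar: {radar ?? 'waiting…'}</p>;
}
```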


Ideating interaction loop


Interaction loop evolution

Designing and building the integrated aluminium stand


Form exploration
