I was invited to participate in a residency at The Studio for Creative Inquiry at Carnegie Mellon University during the spring of 2019, where I received a grant to assist with the development of Handsfree.js. What you're seeing in the video above is me mirroring my head movements on a Universal Robots arm.

After attaching a rubber head to the end of the arm, I used computer vision to detect my head pose (yaw, pitch, roll) from my browser's webcam using just JavaScript! I then ran a small Python server on the robot itself that took the pose values and matched the rubber head's pose using inverse kinematics. I had so much fun making this!
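The browser-to-robot handoff can be sketched as a tiny Python server that accepts pose values and clamps them to a safe range before they reach the inverse-kinematics solver. This is a minimal illustration, not the actual code from the project: the JSON shape, the port, and the `LIMITS` values are all assumptions, and the real limits would depend on the arm's mounting and workspace.

```python
import json
import math
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical safe limits in radians for the rubber head's pose;
# real values depend on the arm's mounting and workspace.
LIMITS = {"yaw": math.radians(45), "pitch": math.radians(30), "roll": math.radians(30)}


def clamp_pose(pose):
    """Clamp incoming yaw/pitch/roll values to the safe ranges
    before they are handed to the inverse-kinematics solver."""
    return {
        axis: max(-lim, min(lim, float(pose.get(axis, 0.0))))
        for axis, lim in LIMITS.items()
    }


class PoseHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Assumed payload from the browser: {"yaw": ..., "pitch": ..., "roll": ...}
        length = int(self.headers.get("Content-Length", 0))
        pose = json.loads(self.rfile.read(length) or b"{}")
        target = clamp_pose(pose)
        # The real server would feed `target` to the IK solver that
        # positions the rubber head; here we just echo it back.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(json.dumps(target).encode())


def serve(port=8000):
    """Run the pose server (blocks forever)."""
    HTTPServer(("0.0.0.0", port), PoseHandler).serve_forever()
```

Clamping on the robot side is a cheap safety net: even if the head tracker glitches and reports a wild angle, the arm never gets asked to swing outside its comfortable range.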