UR5 Robot

An experiment in puppeteering a UR5 robot through the browser

I was invited to participate in a residency at The Studio for Creative Inquiry at Carnegie Mellon University in the spring of 2019, where I received a grant to assist with the development of Handsfree.js. What you’re seeing in the video above is me mirroring my head movements on a Universal Robots UR5 arm.

After attaching a rubber head to the end of the arm, I used computer vision to detect my head pose (yaw, pitch, roll) from my browser’s webcam using just JavaScript! I then ran a small Python server on the robot itself that took those pose values and matched the rubber head’s pose using inverse kinematics. I had so much fun making this!
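The browser-to-robot handoff can be sketched roughly like this — a hypothetical illustration, not the project’s actual code (the function name, angle limit, and message shape are all assumptions): the detected angles are clamped to a safe range, converted to radians, and packaged for the Python server, which handles the inverse kinematics.

```javascript
// Hypothetical sketch of the browser side (not the project's actual code):
// clamp the detected head pose to a safe range and package it as the
// message a pose server might consume.
const DEG2RAD = Math.PI / 180;

function packPose(yawDeg, pitchDeg, rollDeg, limitDeg = 45) {
  // Clamp each angle so an extreme detection can't fling the arm around.
  const clamp = (v) => Math.max(-limitDeg, Math.min(limitDeg, v));
  return {
    yaw: clamp(yawDeg) * DEG2RAD,
    pitch: clamp(pitchDeg) * DEG2RAD,
    roll: clamp(rollDeg) * DEG2RAD,
  };
}

// In the browser this could be streamed each frame, e.g. over a WebSocket:
// ws.send(JSON.stringify(packPose(yaw, pitch, roll)));
```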


A JavaScript library for interacting with pages through face gestures

As I was working on NextCheckbox.com in 2018, I befriended an individual at a homeless shelter who was recovering from a severe stroke. The stroke had left him unable to communicate effectively with friends and family, as he couldn’t speak clearly on the phone, nor could he type or text. This inspired me to begin working on a “Face Pointer”.

Using computer vision, I started a library that took in a user’s head pose through the browser’s webcam and used it to place a pointer on the screen. The user could then smile, raise an eyebrow, or use any other custom gesture to activate clicks. The project received numerous awards, including a grant from Google PAIR, a residency at The Studio for Creative Inquiry at Carnegie Mellon, a School of AI Fellowship, a grant from Glitch.com, and support through GoFundMe.
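The pointer logic described above boils down to two small pieces: a linear mapping from head angles to screen coordinates, and a gesture score gating clicks. Here’s a minimal sketch of that idea — the function names, angle range, and threshold are illustrative assumptions, not Handsfree.js’s actual API:

```javascript
// Hypothetical sketch of a face pointer (not Handsfree.js's actual API):
// yaw/pitch within ±rangeDeg degrees maps linearly onto the screen, and a
// gesture score (e.g. smile strength in [0, 1]) past a threshold clicks.
function poseToPointer(yawDeg, pitchDeg, screen, rangeDeg = 30) {
  const clamp = (v) => Math.max(-rangeDeg, Math.min(rangeDeg, v));
  return {
    x: ((clamp(yawDeg) + rangeDeg) / (2 * rangeDeg)) * screen.width,
    y: ((clamp(pitchDeg) + rangeDeg) / (2 * rangeDeg)) * screen.height,
  };
}

function shouldClick(gestureScore, threshold = 0.7) {
  return gestureScore >= threshold;
}
```

Looking straight ahead (yaw 0, pitch 0) lands the pointer at screen center; in practice a real implementation would also smooth the pose across frames to keep the pointer from jittering.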

Later versions of the library included support for body pose estimation and hand gesture recognition: