Facial projection mapping

January 2017

In collaboration with Josh Kim, Anton Kuznetsov, and Hannah Tomio

Real-life Snapchat filters

Snapchat filters, or lenses, have become a ubiquitous part of social media. They are overlays that augment your face in real time on the screen.

Our project aimed to bring these to life by using computer vision and a projector to map overlays directly onto real faces.

HOW IT WORKS

Our setup uses two computers, a USB webcam, and an Epson EX3240 projector. The webcam is connected to one computer, which runs a facial detection program; the other computer is connected to the projector and runs a program that draws the mask (or filter) to be projected. Both programs are written in JavaScript, using the clmtrackr and p5.js libraries respectively.

To connect the two computers, we set up a web server, written in Go, running on the webcam computer. The facial detection program sends the coordinates of detected facial features to this server, which relays them over WebSockets to the display program on the projector computer. The display program then applies a transformation matrix, determined beforehand as part of a calibration process, to compensate for discrepancies between the camera's and projector's coordinate frames, and draws the mask at the transformed locations.
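The calibration step can be sketched as follows. This is an illustrative example, not the project's actual code: the function name and the example matrix values are made up, but the math is the standard way to apply a 3x3 homogeneous (projective) transform to 2D landmark coordinates before drawing them.

```javascript
// Hypothetical sketch: map camera-space facial landmarks into
// projector space with a 3x3 homogeneous transformation matrix.
// In the real pipeline, the matrix would come from calibration and
// the landmarks from clmtrackr's getCurrentPosition().

// Apply a 3x3 matrix M (row-major, length 9) to a point [x, y].
function applyTransform(M, point) {
  const [x, y] = point;
  const w = M[6] * x + M[7] * y + M[8]; // homogeneous divisor
  return [
    (M[0] * x + M[1] * y + M[2]) / w,
    (M[3] * x + M[4] * y + M[5]) / w,
  ];
}

// Example matrix (invented values): scale by 0.5, translate by (100, 50),
// roughly the shape a camera-to-projector calibration might produce.
const calib = [
  0.5, 0,   100,
  0,   0.5, 50,
  0,   0,   1,
];

// Two sample landmark coordinates in camera space.
const landmarks = [[200, 300], [240, 310]];
const projected = landmarks.map((p) => applyTransform(calib, p));
// projected → [[200, 200], [220, 205]]
```

Each frame, the projector program would run every received landmark through this transform and hand the results to p5.js drawing calls.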