At Imaginology, I worked with Nathan Greene to pioneer a live digital puppetry performance. We built a pipeline for bringing real-world puppets into the digital world for live performances. As the project's 3D artist, I imported 3D scans, repaired meshes, and textured and rigged the digital doubles for use in live broadcasts.
Nathan walks through our process in the first video linked below: we used a Microsoft Kinect, Autodesk MotionBuilder, and some beta software to sync the two. Overall it was an excellent prototype, and we identified promising future applications in projection mapping and theatre performances.