This project is a fully functional AR + video application that lets users scan a room and composite a person into any virtual background, without a green screen, all with just an iPhone. As the technical founder of this company, my roles included production, engineering, and general technical development. Basically, my job was to make something that is currently impossible and figure out how to make it work. I learned a lot about people management and how to quickly pick up new software and programming languages for this project. The goal is to provide a completely mobile solution for tracked compositing, with the ideal result matching the quality one might achieve on a traditional film set. Anyone can then record a video snippet and share it with friends on any social platform!